AI's no-fly zones: 5 executives weigh in

It is clear that healthcare leaders are engaged in the artificial intelligence space. 

A McKinsey & Co. survey released in June found that nearly 90% of health system executives surveyed, in both technical roles (such as CIO or chief technology officer) and nontechnical roles (such as CEO or CFO), reported digital and AI transformation as a high or top priority for their organization.

Health systems acknowledge the importance of these efforts in supporting operations and organizational strategy. However, executives told Becker's they are approaching AI cautiously, mindful of the importance of personal connection and human touch.

Below, five executives answer the question: What specific parts of healthcare delivery, operations and decision-making are best left to human judgment?

Gerard Colman, PhD. CEO of Baptist Health (Louisville, Ky.): Final clinical decisions regarding patient care are not a place for AI in its current state. At Baptist Health, we believe the final decision should always rest with a human clinician to ensure accountability and ethical consideration. Other sensitive areas where we believe AI use should be approached with caution, or even deemed off-limits, are the informed consent process, end-of-life care decisions and behavioral health diagnosis and treatment (given the nuanced and sensitive nature of care in this specialty).

Emergency department care is another area where AI could support triage but should not make final decisions on patient treatment priorities, given that the fast-paced, often unpredictable nature of ED care requires experienced human judgment.

Aidan Hettler. CEO of Sedgwick County Health Center (Julesburg, Colo.): In the context of value-based healthcare and relationship-centered care, certain aspects of healthcare delivery, operations and decision-making should remain off-limits for artificial intelligence to preserve the essential human elements that are crucial to patient outcomes and experiences. Personalized patient interactions, such as discussing difficult diagnoses or understanding a patient's emotional state, require empathy and trust that only human providers can offer. Similarly, ethical decision-making, especially in complex situations like end-of-life care or resource prioritization, necessitates human judgment that considers cultural nuances and moral dilemmas beyond what AI can provide.

Moreover, care coordination, which is vital in value-based healthcare, involves understanding a patient's personal context, including family dynamics and social support, which AI cannot fully grasp. Complex clinical cases also benefit from the experience and intuition of healthcare providers who can integrate AI input with their clinical knowledge to make the best decisions for the patient. Additionally, patient advocacy and navigation through the healthcare system are roles that require a human touch to address disparities and support patients in overcoming barriers. While AI can enhance healthcare delivery through data-driven insights, the human elements of care — such as empathy, ethical consideration and personalized support — are irreplaceable and should be safeguarded.

Dave Lehr. Chief Strategy Officer of Meritus Health and COO of the proposed Meritus School of Osteopathic Medicine (Hagerstown, Md.): In a field where there is a lot of technical subject matter, it is tempting to believe that the most important decisions can be enhanced by a highly intelligent artificial brain that exceeds human capability in reasoning. However, I would argue that the most important decisions in healthcare are not those that involve correct versus incorrect technical solutions. The most important medical decisions are deeply personal human choices that rest squarely on our own values, ethics and recognition of our mortality.

Here are some examples of the types of medical questions we see every day in a modern healthcare system:

  • At age 44, should I take on $60,000 of debt attempting to have a baby through IVF treatment with no guarantee of success?
  • My cancer has a 20% survival rate and I'm currently 71. Should I consider palliative care and focus my last days at home with family?

AI may very well exceed human capacity in most intellectual and technical subject areas relatively soon. But we should never outsource patient autonomy to it. There are certain things that make us who we are: individuals with the freedom to live according to our beliefs, hopes and apprehensions. Healthcare is the industry with the most intimate connection to these things. We should not allow the patient voice to be lost in a sea of AI-generated recommendations that are disconnected from the patient's wishes.

Amit Vashist, MD. Senior Vice President and Chief Clinical Officer of Ballad Health (Johnson City, Tenn.): Carefully conceptualized technologies, advanced data analytics, traditional machine learning tools and now deep learning models like Gen AI hold great promise to transform healthcare delivery and improve clinical outcomes. However, their success is predicated on an organizational culture that surrounds and supports these modalities, with caregiver teams able to weigh in and help formulate holistic solutions targeting overall outcomes rather than zooming in on very narrowly focused use cases. I am starting to get somewhat jaded by the hyperbole about how Gen AI by itself will solve the most pressing healthcare issues plaguing us currently. We need to realize that at the end of the day, healthcare boils down to a very sacrosanct interaction between a patient and a clinical caregiver that puts the patient on a journey from sickness to wellness or from wellness to continued wellness.

At Ballad Health, as we carefully calibrate the integration of Gen AI into our care delivery models and build up our AI governance structure, much of our work has been laser-focused on putting back the cultural and foundational pieces of good patient care: rolling out our high-reliability organization journey with an emphasis on psychological safety, zero harm and just culture; addressing caregiver burnout; and mitigating clinical variation.

Cheryl Nester Wolfe, RN. President and CEO of Salem (Ore.) Health Hospitals and Clinics: We see the potential for AI to enhance healthcare, and we work closely with our chief applications officer at Salem Health, who is also a registered nurse, to identify areas where we can engage with AI. At the same time, we believe it's crucial that we maintain the human touch in areas like end-of-life decisions, the patient-physician connection and ethical choices. We are proactive in evaluating where AI can be genuinely helpful, but with a cautious eye, ensuring that it supports, rather than replaces, the empathy and trust that are essential in patient care.

Editor's note: This piece was updated on Aug. 16.
