March 22, 2024
Healthcare Embraces AI as Industry Leaders Seek Support for Clinicians
In sessions at ViVE 2024, health IT leaders discussed artificial intelligence use cases and considerations for healthcare organizations to ensure quality, buy-in and ease of adoption.
Health IT leaders at ViVE 2024 agreed that artificial intelligence is here to stay, though opinions differed about how long it will take healthcare organizations to adopt AI tools that meaningfully impact care outcomes. While questions around bias, privacy and liability remain for healthcare organizations considering adoption, the industry seems to be embracing AI for its potential to mitigate clinician burnout and improve patient experiences and outcomes.
“AI’s productivity revolution is going to continue to impact healthcare. Providers and clinicians can turn their chairs around and spend more time with patients and less time clicking through the electronic medical record,” said Christine Swisher, chief scientific officer for Project Ronin, adding that AI can also support early detection of adverse events and disease.
“Models are helping us get a jump on which patients need attention quickest,” said Dr. Mark Sands, associate chief medical officer and vice president of clinical transformation and improvement at Northwell Health. “It gives clinicians more actionable information sooner.”
Dr. Clark Otley, chief medical officer for Mayo Clinic Platform at Mayo Clinic, said he’s seeing an expansion of AI in specialties such as cardiology. AI can analyze patient data and predict who could have heart failure in the next two to three years. This allows care teams to intervene with preventive care strategies.
The possibilities are endless, but to get there, healthcare organizations must address concerns about quality, bias, patient safety and privacy through governance, stakeholder engagement, and a balance of innovation with intentional planning and transparency.
Setting Up Healthcare Organizations for AI Success
Physicians are stressed by the realities of healthcare, according to Otley, who said the industry needs to work together to make AI simple to ease clinical workflows.
“There are some skeptics. Any product you introduce is going to come with a learning curve,” said Sands. “Anything that changes clinicians’ workflows is going to be problematic. Many things die on the altar of workflow. If you pay attention to how everyone works, you’re going to see the biggest return.”
Jayme Strauss, chief clinical officer at Viz.ai, explained that it’s important to work with users on the ground and map the end-to-end workflows of clinicians, technicians and everyone involved to understand where the patient touches the care team, then build around that.
However, Swisher pointed out that a major barrier for AI adoption in healthcare is the question of whether AI can be used without the fear of harming those that healthcare aims to heal. She said the industry needs standards, guidelines and best practices that can be adopted in a practical way. Before implementing AI solutions, healthcare organizations should consider patient safety, transparency, data protection, bias mitigation and the usefulness of the tool.
Panelists pointed out that AI companies and organizations need to collaborate on data sharing and de-identification to drive innovation safely. Otley emphasized the need for governance as healthcare organizations move quickly toward AI adoption. Having thoughtful governance in place can enable health systems to move as fast as possible on AI without getting over their skis when it comes to patient privacy.
The stakes in healthcare are high because organizations are dealing with patients’ lives and sensitive data. Otley and Swisher agreed that the industry needs to develop regulatory frameworks, along with assurance labs to test solutions against those frameworks.
“If you think about the incredible network of clinical-trial organizations focused on drug and device trials, we need that for digital healthcare trials because it is personalized,” said Otley, adding that they should be tested in clinically balanced settings. Assurance labs are being promoted by the College of Healthcare Information Management Executives and the Food and Drug Administration to put digital healthcare solutions through rigorous analysis in conjunction with the frameworks created.
Before ChatGPT was introduced, Swisher said AI companies were successful when focused on fitting into clinical workflows, solving meaningful challenges, bringing real-world value as defined by the Quadruple Aim, reducing healthcare costs and improving the clinician experience and patient outcomes. She believes those definitions of success still apply today.
To prevent biased algorithms, Strauss said, there’s a need for social determinants of health data collection to ensure that the data is representative of diverse patient populations. If that data isn’t collected, she said, then it could lead to health equity failures. Ensuring transparency into the data used to train algorithms — and the communication of that information with clinicians — is another important step for the success of AI adoption.
The Importance of Measuring the ROI of AI Solutions in Healthcare
As healthcare organizations begin to implement AI solutions, it’s crucial that innovation leaders consider how to best measure ROI.
Susan Pasley, chief nursing officer at CareRev, said AI should reduce administrative burdens, make diagnosing patients a quicker process, and increase the number of patients a physician can see.
“What is the problem you’re trying to solve? Every company is coming out with an AI-supported solution,” she said, adding that healthcare leaders need to make sure the solution they implement addresses a particular problem. “That’s how to measure the ROI.”
Sunil Dadlani, executive vice president and chief information and digital transformation officer at Atlantic Health System, agreed. He explained that healthcare leaders tend to get “shiny technology toy” syndrome, getting excited about a solution and then trying to figure out which problem it might fix. However, he said, it’s important to define the problem first before starting a proof-of-value or proof-of-concept trial and to identify key performance indicators for an AI solution.
Kyna Fong, Elation Health CEO and co-founder, reminded the audience that it’s also important to consider who gets the benefit of AI’s ROI. For example, if AI can give clinicians an extra hour or two in their day, then that’s a huge return. Fong said AI can lower costs, improve quality and make healthcare workflows faster and more efficient.
Moderator Dr. John Whyte, chief medical officer at WebMD/WebMD Ignite, asked where healthcare organizations should be focused when considering AI adoption.
“Where do returns accrue to drive adoption? As far as where we’re focused, we drive adoption through our customers, who are driven by clinicians,” Fong replied. “For us, the ROI is, how do we reduce clinical and administrative burden with the expectation that that then drives patient care? The first principle of a healthcare system is to take care of patients. It’s important from a community and healthcare leadership perspective to think about the impact on patients at the end of the day, regardless of which stakeholder we’re supporting. If we keep that in mind, I think that’s a great North Star for the application of AI.”
Whether an organization chooses to focus on patient experience and engagement or operational efficiencies, Dadlani said, the organization needs to have digital literacy and maturity to harness the power of the technology while keeping patient safety and outcomes as the No. 1 priority.
He explained that the majority of AI proofs of concept in healthcare can’t scale to an enterprise level, and he recommended three best practices to prove the value of AI solutions:
- Create a safe and secure environment to implement concepts
- Ensure the organization is using de-identified and anonymized data
- Have technology maturity within the organization to create a modular architecture
Organizations also need to consider how they can measure success to prove the AI solution’s value, what it takes to scale at an enterprise level, what it costs, whether the organization has the right level of digital maturity and literacy across the system, and whether the right stakeholders are involved.
Fong emphasized that, while organizations need to ensure they have the right governance structures in place to support AI adoption, the biggest obstacle to AI implementation in healthcare is the technology itself. She noted that the role of AI companies is to create solutions that people will adopt and use. These solutions ultimately need to be designed in a way that works within healthcare workflows and can be trusted by the organization and its end users to support patient care goals.
“As technology companies, we should rise to that challenge and deliver what works,” she said.
Story by Jordan Scott, an editor for HealthTech with experience in journalism and B2B publishing.