Artificial intelligence has become one of the most hyped emerging technologies for the healthcare sector, spurring hopes it could help solve the heavy burden of administrative tasks on providers and manage vast troves of inaccessible data.
But implementing AI can be challenging for healthcare organizations, given the resources needed to put the technology in place and ensure it continues to work well.
Healthcare organizations have used some types of AI, such as tools that help radiologists interpret imaging results, for years, said Rob Havasy, senior director for informatics strategy at HIMSS. But the advent of generative AI tools, the public’s growing awareness of the technology and AI’s rapid advancement in recent years have added challenges for the healthcare sector.
“Organizations are doing well with it, but they’re running into a point where the pace of change is starting to overwhelm an already stressed workforce,” he said.
Here are three key points for health systems to consider as they roll out AI tools:
Know your metrics
Health systems need to decide what they’re trying to achieve when implementing an AI tool and figure out how they’ll measure outcomes, said Karla Eidem, regional managing director for North America at Project Management Institute.
Only about half of all projects attempted in the healthcare sector are considered successful, while another 38% have mixed results and 10% fail, according to a report published last year by PMI.
One of the main reasons projects don’t succeed is that healthcare organizations aren’t clear about how they define success, or they don’t have a system in place to evaluate those metrics, Eidem said.
And sometimes goals might conflict with one another. For example, an AI-based image analysis tool might improve clinical care at the cost of seeing fewer patients, Havasy said. So what’s the organization’s plan if it can’t have both?
“When the goals are not clearly defined, then you have this group of people navigating through complexity and lack of clarity,” Eidem said. “Then, of course, it derails everything else.”
Lean on informaticists, project managers
Utilizing a project manager can also help health systems deploy AI tools, Eidem said.
Project managers can oversee the workflow, translating among hospital groups that might have different priorities, such as physicians, technologists, finance staff and attorneys, she said.
Plus, putting a project manager in place could make sure the job isn’t dropped onto a clinician who already has a mountain of other tasks.
“You give that project to a project champion, which is a physician, but now he has to talk to IT and legal and everything else,” she said. “And instead of being kicked around, you have a project professional in the middle that’s connecting the dots.”
In healthcare, those project management roles will often belong to an informaticist, a professional who uses technology and data to improve care delivery at healthcare organizations, Havasy said.
“After Meaningful Use and after we implemented EHRs, informaticists felt a little less valued inside hospitals, right? We put in a big platform, the implementation’s done,” he said. “Now that everything becomes a technical project, well, project managers and informaticists are suddenly popular again.”
Under-resourced providers need to consider training data
Implementing AI projects could be particularly challenging at under-resourced and safety-net organizations, which may lack the technical expertise and staffing needed for adoption, experts say.
Some hospitals may already be falling behind. Data released this month by the Assistant Secretary for Technology Policy/Office of the National Coordinator for Health IT found small, rural, independent and critical-access hospitals are less likely to report using predictive AI tools.
The challenge might not be acquiring AI products, Havasy said. Many EHR companies are adding these tools to their offerings, so providers can pull tools from their vendor.
But these AI products likely won’t be trained on data similar to each provider’s patient population, which could mean they won’t perform as well. That means under-resourced providers should focus their efforts on making sure these tools fit their care environment — and continuing to monitor their performance to ensure they stay up to snuff, Havasy said.
“No AI tool survives first contact with real world data,” he said. “No matter how well it’s trained, it hasn’t been trained on this clinic’s population of patients.”