As health systems navigate tightening budgets, workforce shortages, growing patient volumes driven by market consolidation, and rising expectations for patient outcomes, the promise of artificial intelligence (AI) has never been more compelling—or more complex.
At the 2025 HIMSS AI in Healthcare Forum, our CTO, Chris Gervais, shared key findings from a recent HIMSS Market Insights survey on AI adoption in clinical settings. Conducted in late May and June 2025, the research gathered insights from 125 qualified clinicians and clinical leaders across U.S. hospitals and health systems. The results offer a clear snapshot of where health systems stand today—and what it will take to move from pilot programs to meaningful, system-wide impact.
According to the findings, only 9% of hospitals have implemented AI system-wide, while most remain in pilot phases or stuck in isolated use cases.
Gervais noted that the difference often comes down to strategic intent: successful health systems “invest not just in technology, but in rethinking the core processes it’s meant to improve.”
Before choosing an AI vendor or tool, define your “why.” Are you aiming to reduce diagnostic turnaround times? Improve revenue cycle accuracy? Enable real-time decision support? AI should serve your goals—not the other way around.
AI is only as powerful as the data it’s trained on. Standardization, accessibility, and real-time usability are key—but data alone isn’t enough. As Gervais noted, “AI won’t magically fix your existing messy processes.” If workflows are broken or siloed, AI will only amplify those inefficiencies. We’re swimming in data. But unless it’s structured, codified, and shareable, we can’t do anything with it.
Health systems that are further along in their AI journey recognize this. They’re investing in foundational data infrastructure to unlock more than just billing use cases—enabling everything from predictive alerts to population health insights.
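To make “structured, codified, and shareable” concrete, the sketch below shows one common pattern: representing a free-text lab value as a FHIR-style Observation coded with LOINC so downstream systems can parse it without custom glue. This is an illustrative assumption for this post, not a finding from the survey or a reference to any particular vendor’s product; the patient reference, date, and values are placeholders.

```python
import json

# Hypothetical example: the same hemoglobin A1c result, first as unstructured
# text and then as a FHIR-style Observation coded with LOINC so alerts,
# analytics, and population health tools can actually use it.

free_text_note = "Pt's A1c came back 7.2 today, will follow up."

structured_observation = {
    "resourceType": "Observation",
    "status": "final",
    "code": {
        "coding": [{
            "system": "http://loinc.org",
            "code": "4548-4",            # LOINC code for Hemoglobin A1c
            "display": "Hemoglobin A1c",
        }]
    },
    "subject": {"reference": "Patient/example-123"},   # placeholder identifier
    "effectiveDateTime": "2025-06-15",                  # placeholder date
    "valueQuantity": {"value": 7.2, "unit": "%"},
}

# "Shareable" often just means serializable in a standard format that other
# systems can consume without bespoke parsing.
print(json.dumps(structured_observation, indent=2))
```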
Clinician skepticism remains high—and rightly so.
The survey results bear this out.
Gervais emphasized that buy-in requires intentional effort, noting that “trust is hard to earn and easy to lose,” especially if teams feel left out of the process or unsure of how AI decisions are made.
To build lasting confidence, education has to go beyond the technical. Health systems need to help clinical and operational teams understand how AI enhances their expertise rather than replacing it.
Only 28% of clinical executives surveyed believe their organizations are ready to handle the legal and compliance risks associated with AI. Yet 19% of the organizations surveyed have no formal AI governance structure in place at all.
Gervais warned that adopting AI without clear oversight is worse than risky. As he put it, “If you’re adopting AI without a governance framework, you’re not managing risk. You’re multiplying it.”
Effective governance means engaging legal and compliance teams early, defining ethical guardrails, and holding vendors accountable for transparency, bias mitigation, and data security.
No single department can drive AI transformation alone. It requires executive sponsorship, clinician engagement, IT readiness, and operational alignment.
Health systems that succeed with AI treat it as a shared, cross-functional effort: they define a clear “why,” invest in foundational data infrastructure, bring clinicians along through education and transparency, and put governance in place before they scale.
As health systems push forward, the question isn’t just how to scale AI—but what they’re scaling. As Chris Gervais summed it up, “AI won’t fix your processes—it will amplify them. Make sure what it’s amplifying is worth scaling.”