In health care, augmented intelligence (AI)—often referred to as artificial intelligence—can drive meaningful improvements, such as reducing physician burnout or lowering patient wait times in emergency departments.
But if a health care organization is seeking to integrate AI into its workflows just so it can say it is using AI, its time might be better spent creating videos of otters using Wi-Fi on an airplane.
“I don't bring AI as a bright-and-shiny object and try to sell it to somebody,” said Brett Oliver, MD, a family physician and chief medical information officer for Louisville, Kentucky-based Baptist Health Medical Group.
“What we're doing is using AI like any other tool that would be out there in the sense that we're trying to solve problems, go after opportunities, things like that,” Dr. Oliver said. “So we don't just do AI to do AI.”
And while otter-on-a-plane videos may have no health care applications, they might help persuade someone who tried AI in the past and abandoned it to give the technology another chance.
“It’s a goofy example, but—around a year and a half ago—if you put a prompt into ChatGPT that said, ‘Give me a picture of an otter using Wi-Fi on a plane,’ you’d get these dysmorphic images that I guess you could maybe say was an otter,” Dr. Oliver said.
“Then fast forward just 15 months and there's this amazing 30-second video of an otter walking onto a plane, getting on, flipping open his laptop, cracking a Coke, putting in a straw and getting to work,” he added.
The idea is to show people who gave up on AI after a negative experience a year or two ago that it may be time to reevaluate their position.
Baptist Health Medical Group is part of the AMA Health System Member Program, which provides enterprise solutions to equip leadership, physicians and care teams with resources to help drive the future of medicine.
Looking for a break
One example of how Baptist Health employs AI to solve a problem is BoneView, a fracture-detection program that reads X-rays to lower nighttime wait times in the emergency department.
“We have off-site radiologists that read at night, but they get backed up and plain X-ray films aren't their priority—the more complex studies are higher up on the list,” Dr. Oliver explained.
“Sometimes we've got patients waiting two and three hours just to get a read on their X-ray,” he added. “Now with this AI tool, the ED physicians are comfortable enough to say, ‘I can give you a disposition based on the AI that's looking at this X-ray.’”
A radiologist will then review the findings in the morning.
“So we were trying to help solve a problem, and not just say, ‘Hey, here's some cool software that can help you find fractures sooner,’” Dr. Oliver said.
“On plain X-rays, it's going to look for bony fractures, dislocations, effusions, things like that,” he added. “And we have that in three of our hospitals, but we didn't just turn that live.”
The process started with a training period that ensured that “everybody from the radiology directors as well as the radiologists knew that it was coming and what it involved,” Dr. Oliver said, adding that each had input into the workflow.
Feedback was also gathered on where physicians wanted to see the data, how they wanted it presented and where they wanted it included in their notes.
“That's always a part of that piloting process that we try to do,” Dr. Oliver said.
Improving the patient experience
Baptist Health was an early adopter of DAX Copilot, a note-documentation tool that uses ambient AI scribes to transcribe patient-physician interactions. Using machine learning and natural-language processing, the tool summarizes the clinical content of the discussion and produces a clinical note documenting the visit—though physician review remains a critical step in the process.
The DAX Copilot was targeted at easing physicians’ documentation burdens.
“We had folks staying at offices late,” Dr. Oliver said. “From a physician burnout perspective, it's been extremely helpful.”
But other benefits have emerged as well: 86% of physicians reported an improvement in the patient experience, according to a recent in-house survey.
“I've used the tool myself and to be able to just look your patient in the eye and talk to them for the entire visit—instead of constantly looking at the keyboard—is really, really nice,” Dr. Oliver said. “From a human-behavior standpoint, if you're making more eye contact, people tend to think you're not only more engaged, but you spend more time with them.”
He added that he has heard “amazing stories” from physicians and nonphysician clinical staff who said they were on the verge of leaving medicine but have decided to keep practicing since integrating the documentation tool into their workflow.
“We never wanted to go into medicine to spend more than half of our day in primary care in front of the computer—but that's unfortunately kind of where it evolved to,” Dr. Oliver said.
From AI implementation to EHR adoption and usability, the AMA is fighting to make technology work for physicians, ensuring that it is an asset to doctors—not a burden.
In June, delegates at the 2025 AMA Annual Meeting took several actions to strengthen existing AMA policy (PDF) and ensure that the technology is “explainable,” validated and well defined, and is not used to commit medical research fraud.
Getting feedback, offering training
The AMA surveyed almost 1,200 physicians (PDF) about health AI last fall and found that they are largely enthusiastic about its potential, with 68% seeing at least some advantage to the use of AI in their practice, up from 65% in 2023. Meanwhile, the share of physicians using some type of AI tool in practice rose from 38% in 2023 to 66% in 2024. However, the physicians surveyed said there are steps that health care organizations should take to boost their trust in using health AI.
For example:
- 88% of physicians want a dedicated channel for feedback in case issues arise.
- 85% want privacy assurances.
- 84% want proper training.
- 84% want AI to be integrated into the EHR workflow.
Dr. Oliver agreed that feedback is very important. Physicians are surveyed after a pilot to learn what worked and what did not. With established programs, such as AI-generated responses to patient inquiries, physicians can review the responses and give them a thumbs up or thumbs down.
“That's a small thing, they could add comments if they want,” he said. “We present opportunities to engage at different levels, whether it's the service line, or through our support team and through our informatics team.”
Dr. Oliver noted there may be a “disconnect” between a hypothetical desire for more training as part of AI tool implementation and the reality.
“We put together about a 25-minute introduction to AI video—pretty well done, honestly,” he said, adding that the video was posted with the health system’s internal learning modules and that, after discussions about whether to make viewing mandatory, it was decided to keep it voluntary.
“After about eight weeks—other than my physician informaticists team, which is a small group of five—no physicians took it,” Dr. Oliver said. “I think as it gets more real for them, as it starts becoming maybe more prevalent, we'll get a little bit more.”
He added that more than 200 nonphysicians on staff did take the module.
Unusually rapid acceptance seen
Dr. Oliver noted that generative pre-trained transformers (GPTs), the architecture behind large language models, have been in existence for only about eight years, yet uptake of the technology in medicine has been rapid.
“It is amazing to me how quickly it’s been accepted,” Dr. Oliver said.
“The speed of change in health care is usually like a battleship or a cruise ship, not a speedboat,” he added. “It takes us a long time to turn, but I'm seeing the willingness to turn in all areas of medicine much more readily—which is refreshing.”
But that speed is also cause for concern. A key to ensuring privacy and security “is making sure that we know what we have,” Dr. Oliver said, and that processes are in place before any new IT tool or piece of equipment is added to the system’s network.
What worries him is that a vendor will add an AI component to existing equipment without informing him, telling only the equipment’s end user: “Hey, we’ve got this new AI thing and we’re going to add it on to the network on Monday—you don’t need to do anything.”
“That's the thing that sometimes keeps me up at night,” Dr. Oliver said. “We've got to raise the literacy level of the entire organization. I need that person—physician or nonphysician—to say ‘No, wait a minute, has that been through our process?’”
Learn more with the AMA about the emerging landscape of health care AI. Also, explore how to apply AI to transform health care with the “AMA ChangeMedEd® Artificial Intelligence in Health Care Series.”