The pros and cons of doctors letting AI take notes for them

While artificial intelligence (AI) can improve our lives in countless ways, one inherent risk is that professionals may come to rely on it to do too much. Without realizing it, they can also allow it to put distance between themselves and the people they serve.

One place where that’s true is the medical field. For example, The Ohio State University Wexner Medical Center recently piloted an AI program that takes notes during outpatient clinic visits. This frees doctors to listen more closely and connect with patients while they’re talking. The tool can also capture everything patients say (and everything doctors say in return), which is something human note-takers don’t always do.

Is the time saved used to connect with patients or to just see more of them?

This application of AI can save doctors (and their employers) time, since physicians no longer have to type the notes themselves. That could mean being able to spend more time with each patient.

The doctor who led the pilot said, “We found it saved up to four minutes per visit. That’s time the physician can use to connect with the patient, do education and make sure they understand the plan going forward.” 

Of course, medical facilities might be more interested in how much the tool shortens appointment times and lets doctors see more patients in a day. The AI note-taking tool is now available to OSU doctors for their outpatient appointments, and in the first two weeks they reported saving a total of 64 hours. Notably, patient satisfaction scores also increased.

The risks of relying on AI-generated notes

While AI-generated patient notes can certainly save doctors time and keep their hands free to better examine and diagnose patients, it’s critical that doctors review those notes promptly to make sure there are no substantive errors or omissions. Not all doctors are going to do that. The result could be that erroneous information ends up in a patient’s file, or that something the patient said that could be key to a correct diagnosis is missed.

No matter how much they rely on AI, doctors are typically still responsible for mistakes that occur because they failed to verify that the generated information was complete and accurate. That means they can be held liable if their negligence causes harm to a patient. It’s important for patients to know their rights and their legal options.
