May 31, 2023

UAB experts say AI-assisted medical documentation and reporting will absolutely impact patients in the near future

Andrew Smith, M.D., Ph.D., vice chair of innovation in the Department of Radiology and clinical director of AI in the Marnix E. Heersink Institute of Biomedical Innovation, recently published a study on the use of ChatGPT in Radiology with Asser Abou Elkassem, M.D., a Department of Radiology resident.

Large language models (LLMs) like ChatGPT are advanced artificial intelligence (AI) systems designed to understand and generate human-like responses. These systems can be especially helpful in health care for both patients and providers: LLMs such as ChatGPT offer a promising avenue for enhancing patient engagement while also helping radiologists improve reporting accuracy and efficiency.

“ChatGPT can assist patients by providing information, explaining medical terms and procedures, suggesting questions for healthcare providers, and addressing common questions and concerns,” said Abou Elkassem.

ChatGPT can also reduce omissions and cognitive load for health care providers, and it can produce layperson versions of reports at different educational levels in more than 40 languages. This would allow patients to develop a deeper understanding of their medical conditions and work with their health care providers to co-design a personalized care plan tailored to their individual needs and preferences.
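To make the idea concrete, the sketch below shows one way a layperson report at a chosen reading level and language could be generated through the OpenAI chat API. This is a minimal illustration, not the study's method: the `layperson_summary` helper, model choice, and prompt wording are assumptions, and any clinical use would require the validation discussed later in this article.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def layperson_summary(report_text: str, reading_level: str, language: str) -> str:
    """Restate a radiology report in plain language for a patient."""
    response = client.chat.completions.create(
        model="gpt-4",  # illustrative; any capable chat model could be used
        messages=[
            {
                "role": "system",
                "content": (
                    "You rewrite radiology reports for patients. "
                    f"Use a {reading_level} reading level and write in {language}. "
                    "Do not add findings that are not in the report."
                ),
            },
            {"role": "user", "content": report_text},
        ],
        temperature=0,  # favor consistent, conservative wording
    )
    return response.choices[0].message.content

# Example: an eighth-grade-level Spanish summary of a hypothetical report
print(layperson_summary(
    "Impression: No acute intracranial abnormality.",
    reading_level="8th-grade",
    language="Spanish",
))
```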

Despite its many positive attributes, ChatGPT raises several concerns, including hallucinations, bias, uneven task accuracy in medical contexts, data security issues, and the challenges of training it and integrating it into health care systems.

ChatGPT and other LLMs are trained on vast amounts of internet text and can be further sub-trained (fine-tuned) for specific tasks to improve accuracy. However, any biases in the training dataset are absorbed by the AI system, making it inherently biased.
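As a rough illustration of what such sub-training looks like in practice, the sketch below writes a single chat-style training example in the JSONL format used by common fine-tuning endpoints. The format, file name, and case are illustrative, not drawn from the study. The bias point is visible in the data itself: whatever skew the examples carry, in patient populations, phrasing, or findings, the tuned model will reproduce.

```python
import json

# An invented example of a domain-specific training pair; real training
# data would come from curated, de-identified reports.
examples = [
    {
        "messages": [
            {"role": "system",
             "content": "Summarize the impression of a radiology report."},
            {"role": "user",
             "content": "Findings: 8 mm noncalcified nodule, right upper lobe."},
            {"role": "assistant",
             "content": "8 mm right upper lobe nodule; recommend follow-up CT."},
        ]
    },
]

# Fine-tuning endpoints typically expect one JSON object per line.
with open("radiology_finetune.jsonl", "w") as f:
    for example in examples:
        f.write(json.dumps(example) + "\n")
```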

ChatGPT can also hallucinate, filling gaps in its knowledge with fictitious details. This is especially concerning for patient-centered care, where hallucinations are often presented with great confidence and could lead to incorrect decision-making by the health care provider.

Furthermore, LLMs often have difficulty contextualizing medical information and lack abstract reasoning capability. As such, key aspects of a summary generated by ChatGPT would need validation by the radiologist.

“At present, ChatGPT has not been sufficiently trained in medical knowledge, but this is anticipated to occur imminently,” said Smith. “ChatGPT will be integrated into health care tools like electronic medical records, and health care providers will need to be trained to use it.”

Although there are hurdles, domain-specific training and validation of AI systems offer a path to ChatGPT’s successful implementation in health care.

“AI-assisted medical documentation and reporting will absolutely impact patients in the near future,” said Smith. “LLMs like ChatGPT will improve patient communication, medical documentation, and health care efficacy, standardization, and efficiency.”

Smith and Abou Elkassem’s research on ChatGPT in Radiology can offer insight into the AI system’s larger implications for other medical specialties and scientific fields of study.

“These advancements have the potential to drive the adoption of AI-powered technologies across various medical specialties, leading to enhanced diagnostics, improved patient care, and more informed decision-making,” said Abou Elkassem.