Hospital ghosts? Report says Microsoft's BioGPT riddled with inaccuracies

Microsoft's BioGPT, a generative artificial intelligence tool for biomedical research, is riddled with inaccuracies and misinformation that could be dangerous to patients, Futurism reported.

When the news outlet asked the large language model how many ghosts haunt U.S. hospitals, it cited fabricated data it attributed to the American Hospital Association, claiming that the "average number of ghosts per hospital was 1.4" and that patients "who see the ghosts of their relatives have worse outcomes while those who see unrelated ghosts do not."

The model also falsely claimed that the CDC says vaccines may cause autism, according to the March 7 story.

"BioGPT is a research project," Microsoft Health Futures senior director Hoifung Poon, PhD, told the news outlet. "We released BioGPT in its current state so that others may reproduce and verify our work as well as study the viability of large language models in biomedical research."

Roxana Daneshjou, MD, PhD, a clinical scholar at Palo Alto, Calif.-based Stanford University School of Medicine, told Futurism that generative AI tools like BioGPT are "trained to give answers that sound plausible as speech or written language" but are "not optimized for the actual accurate output of the information."

"My biggest concern is just seeing how people in medicine are wanting to start to use this without fully understanding what all the limitations are," she said.
