While fearless innovation and challenging the status quo are crucial to advancing healthcare, "there is such a thing as too much unconventional thinking," John Halamka, MD, president of Mayo Clinic Platform, wrote recently.
He cited two recent federal government decisions. In an Oct. 7 blog post co-written with Paul Cerrato, senior research analyst and communications specialist for Mayo Clinic Platform, Dr. Halamka noted that CMS proposed a rule aimed at preventing discrimination in the use of clinical decision tools and tech innovation.
As CMS wrote, "covered entities may choose to establish written policies and procedures governing how information from clinical algorithms will be used in decision-making; monitor any potential impacts; and train staff on the proper use of such systems in decision-making."
Meanwhile, the FDA released a proposed rule on the regulation of healthcare artificial intelligence, providing a framework to determine which uses will be considered medical devices. According to the analysis by Dr. Halamka and Mr. Cerrato, "non-device examples can display, analyze, or print" certain medical information that is not "images, signals, or patterns."
They said clinical decision support tools considered software as a medical device — and thus subject to stricter regulation — include continuous glucose monitoring systems, computer-aided detection and diagnosis, tools that analyze medical images and waveforms such as electrocardiograms, and apps that provide risk scores for a disease.
"The rules provided by CMS and FDA remind us all that advances in healthcare informatics and clinical care require a delicate balance between crazy and cautious," Dr. Halamka and Mr. Cerrato wrote.
They also pointed to an Oct. 6 paper from the Coalition for Health AI, of which Mayo Clinic Platform is a member, seeking input on its framework to prevent algorithmic bias.