Company resolves AI ad dispute with Texas AG

An AI company that works with health systems has reached an agreement with Texas Attorney General Ken Paxton to resolve allegations that it issued inaccurate and deceptive claims regarding the reliability and safety of its products.

Pieces Technologies, an AI health tech company, offers a generative AI tool that summarizes patients' conditions and treatments for hospital staff. The company works with four hospitals in Texas, according to a Sept. 18 news release.

An investigation by Mr. Paxton's office revealed that Pieces made misleading claims about the accuracy of its healthcare AI products, potentially endangering the public interest. According to the news release, Pieces created various metrics to promote its AI tools as "highly accurate," advertising an error rate or "severe hallucination rate" of "<1 per 100,000" in its marketing efforts.

Mr. Paxton's investigation concluded that these metrics were likely inaccurate and may have misled hospitals regarding the true accuracy and safety of the company's products.

As part of the agreement, Pieces has committed to providing transparent and accurate information about the accuracy of its products. The company has also agreed to ensure that hospital staff using its generative AI tools for patient care are fully informed about the appropriate level of reliance on these products.

"We are extremely disappointed in the Texas Office of the Attorney General's press release that dangerously misrepresents the Assurance of Voluntary Compliance into which Pieces entered," a statement from Pieces shared with Becker's reads. "Importantly and as noted specifically in the AVC, Pieces vigorously denies any wrongdoing and believes strongly that it has accurately set forth its hallucination rate, which was the sole focus of the AVC."

The company also said the AVC does not mention any safety concerns regarding Pieces' products, nor is there evidence that the public interest was ever at risk.

"The AVC focuses solely on the company's reporting of hallucination rates in the context of an independently developed risk-classification system that is based on severity," Pieces statement reads. "Importantly, there is no industrywide risk classification system for generative AI hallucinations for inpatient clinical summarization that exists today."

Copyright © 2024 Becker's Healthcare. All Rights Reserved.
