OpenAI’s Transcription Tool Hallucinates. Hospitals Are Using It Anyway

On Saturday, an Associated Press investigation revealed that OpenAI's Whisper transcription tool creates fabricated text in medical and business settings despite warnings against such use. The AP interviewed more than a dozen software engineers, developers, and researchers who found that the model regularly invents text speakers never said, a phenomenon often called a “confabulation” or “hallucination” in the AI field.

Upon its release in 2022, OpenAI claimed that Whisper approached “human level robustness” in audio transcription accuracy. However, a University of Michigan researcher told the AP that Whisper created false text in 80 percent of the public meeting transcripts examined. Another developer, unnamed…