
Medical Transcribers Must Eschew AI to Save Lives, Says Alphabet Transcription Services Founder

Reports have found error rates of just over 23% when using AI transcription, which is too high when human lives are involved.

Herts-based Alphabet Transcription Specialists are bucking the trend of AI-driven transcription processes to ensure life-saving accuracy for their clients in the healthcare industry. With 25 years in the transcription industry, serving both pharmaceutical and medical clients, Alphabet founder Denise Elsdon sounds the alarm against transcription services that have switched to AI-generated transcription, particularly in the healthcare sector. She states, “What many healthcare providers don’t realize is how bad the AI software is at producing accurate transcripts. Unfortunately, lots of AI software companies are offering this service and transcription companies are jumping on the bandwagon. And, when you pair these inaccuracies with people whose native language isn’t English, it’s a minefield.”


Research bears out her insights. According to one article on the subject, AI systems rely on training data sets to “learn” how to transcribe conversations. If that data comes with a bias towards a particular accent, the resulting transcription will be less accurate. The authors advise healthcare providers, “If the data collected is biased, that is, it targets a particular race, a particular gender, a specific age group, then the resulting model will be biased.”
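The practical consequence is easiest to see when accuracy is broken down by speaker group rather than reported as a single figure. The sketch below is purely illustrative (the accent labels and numbers are hypothetical, not from any study cited here): a model can show a respectable aggregate error rate while performing far worse for speakers who were under-represented in its training data.

```python
# Illustrative only: hypothetical per-utterance word error rates (WER),
# grouped by speaker accent, to show how an aggregate figure can hide
# large disparities between groups.
from statistics import mean

wer_by_accent = {
    "received_pronunciation": [0.05, 0.07, 0.06],
    "scottish":               [0.18, 0.22, 0.20],
    "indian":                 [0.21, 0.19, 0.24],
}

# Aggregate accuracy looks tolerable...
overall = mean(w for scores in wer_by_accent.values() for w in scores)
print(f"overall WER: {overall:.0%}")

# ...but per-accent results tell a different story.
for accent, scores in wer_by_accent.items():
    print(f"{accent:>24}: {mean(scores):.0%}")
```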

The pharmaceutical industry changes daily, with new drugs and regulations constantly being introduced, and vast knowledge and expertise are required to understand this specialized industry. New data would need to be fed to the AI regularly and, again, must not be biased across different races, genders, and accents. For AI to maintain its integrity, that data would also need to be drawn from different regions throughout the UK, allowing all dialects to be learned and maintained.

Even more telling is a TechCrunch article that reports the progress speech recognition company Speechmatics has made in transcribing accented speech. As writer Devin Coldewey points out, Speechmatics’ latest AI-powered transcription model achieved only an 82.8% accuracy rate when analyzing Black people’s voices. That’s simply not acceptable when lives are at stake, as in pharmaceutical — and even legal — transcriptions, according to Denise. “Because we deal with a lot of pharmaceutical clients, we’ve found that drug names and diseases often aren’t spelt correctly in AI-generated transcriptions. They even miss whether a drug is branded or generic. And some of our clients request some parts of their transcriptions verbatim, which is even more of a challenge for AI transcription models,” she adds.



That 82.8% rate applies only to African-American accents. Even a leading-edge model like Speechmatics’, the TechCrunch article admitted, could only make “small but significant improvements” for Southern African, Scottish, Filipino, and Indian accents. With the UK’s National Health Service (NHS) reporting that nearly 15% of its staff hail from nations other than the UK, this should sound the alarm among healthcare providers considering AI transcriptions, Denise points out.

These are not the only studies showing inaccuracy in artificial intelligence transcription. A 2010 study found error rates of just over 23% when using AI for transcription, and it notes that AI simply cannot keep up with spontaneous human speech and utterances and transcribe them with certainty. This margin of error is too large when dealing with human lives.
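For readers unfamiliar with the metric, the following minimal sketch (the sentences and drug name are invented for illustration, not taken from any study above) shows how a word error rate like “just over 23%” is typically computed: the word substitutions, insertions, and deletions needed to turn the machine transcript into the human reference, divided by the number of words in the reference.

```python
# Minimal word error rate (WER) sketch using word-level edit distance.
def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming Levenshtein distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

# A single misrecognised drug name pushes the error rate up sharply
# on a short utterance (hypothetical example).
ref = "the patient was prescribed generic metformin twice daily"
hyp = "the patient was prescribed met forming twice daily"
print(f"WER: {word_error_rate(ref, hyp):.0%}")  # -> WER: 25%
```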


With different dialects and jargon in use, not only across the countries of the United Kingdom but also across regions within those countries, it is easy for AI to misinterpret comments a human would easily understand. The acronyms alone vary within the pharmaceutical industry, and that is not something AI can currently disambiguate reliably.

This is best summed up by Denise Elsdon with her perspective on the dangers of using AI transcription software for pharmaceutical applications. She says, “All in all, there is nothing more accurate than the human ear and thorough research. These factors are all lacking in speech recognition software. For that reason, Alphabet Transcription Specialists will continue to employ expert human transcriptionists to deliver pinpoint accuracy to our pharmaceutical clients.”


