
Translated Debuts Trust Attention for Unprecedented Quality in MT, Paving the Way for Accuracy in Generative AI

The latest version of ModernMT (version 7) enhances translation quality by up to 42% using Trust Attention, a novel technique developed by Translated that links the origin of data to its impact on translation accuracy.

Translated, a leading provider of AI-powered language solutions, announced the launch of ModernMT Version 7, a significant upgrade to its adaptive machine translation (MT) system. The latest version introduces Trust Attention, a novel technique inspired by the human brain’s ability to prioritize information from trusted sources, improving translation quality by up to 42%. This innovation sets a new industry standard, moving away from traditional MT systems that are hampered by an inability to distinguish trustworthy data from lower-quality material during training.


ModernMT now uses a first-of-its-kind weighting system to prioritize learning from high-quality, qualified data – meaning translations performed and reviewed by professional translators – over unverified content from the Web. As it did when introducing adaptivity, Translated looked to the human brain for inspiration in developing this new technique. Just as humans sift through multiple sources of information to identify the most trustworthy and reliable ones, ModernMT V7 identifies the most valuable training data and prioritizes its learning accordingly.
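Translated has not published the implementation details of Trust Attention. Purely as an illustration of what a provenance-based weighting scheme can look like in practice, the sketch below scales each training example’s loss by a trust score tied to its data source; the weight values, labels, and function names are hypothetical assumptions, not Translated’s actual method.

```python
import torch
import torch.nn.functional as F

# Hypothetical trust scores per data source; Translated has not disclosed
# its actual weighting, so these values are illustrative only.
TRUST_WEIGHTS = {
    "professional_reviewed": 1.0,  # translated and reviewed by professionals
    "professional": 0.8,           # professionally translated, not yet reviewed
    "web_crawled": 0.3,            # unverified content from the Web
}

def trust_weighted_loss(logits, targets, provenance, pad_id=0):
    """Cross-entropy loss in which each example's contribution is scaled
    by the trustworthiness of the source it came from.

    logits:     (batch, seq_len, vocab) model outputs
    targets:    (batch, seq_len) reference token ids
    provenance: one source label per example in the batch
    """
    # Per-token loss, with padding positions zeroed out
    token_loss = F.cross_entropy(
        logits.transpose(1, 2), targets,
        ignore_index=pad_id, reduction="none",
    )  # shape: (batch, seq_len)

    # Average over non-pad tokens to get one loss value per example
    mask = (targets != pad_id).float()
    example_loss = (token_loss * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1)

    # Scale each example by its trust score before averaging over the batch
    weights = torch.tensor(
        [TRUST_WEIGHTS[p] for p in provenance], device=logits.device
    )
    return (weights * example_loss).sum() / weights.sum()
```

Weighting the loss rather than filtering data outright preserves the breadth of web-scale corpora while letting qualified, professionally reviewed translations dominate what the model actually learns; this is a common rationale for such schemes and matches the prioritization the announcement describes.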

“ModernMT’s ability to prioritize higher quality data to improve the model is the most significant leap forward in machine translation since the introduction of dynamic adaptivity five years ago,” said Marco Trombetti, CEO of Translated. “This exciting innovation opens new opportunities for companies to use MT to take their global customer experience to the next level. It will also help translators increase productivity and revenue.”


The introduction of this new approach is a major step forward for companies seeking greater accuracy when translating large volumes of content or requiring a high degree of customization of the MT engine, as well as for translators integrating MT into their workflow.

Today, there’s considerable discussion regarding the application of large language models (LLMs) in translation. While traditional machine translation prioritizes accuracy over fluency, LLMs tend to emphasize fluency. This can sometimes result in misleading outputs due to hallucinations, where what the model produces is not grounded in the source text or its training data. Translated believes Trust Attention can enhance the accuracy of generative models, reducing the chances of such errors. This could set the stage for the next era of machine translation.

All Translated clients will benefit from the improved quality of the new MT model, resulting in faster project turnaround times. Translators working with Translated will experience the power of the new model through Matecat, Translated’s free, web-based, AI-powered CAT tool, as will translators using any officially supported CAT tool (Matecat, memoQ, or Trados) with an active ModernMT license.
