Artificial Intelligence | News | Insights | AiThority

DEKRA and LatticeFlow Launch AI Safety Assessment Services

DEKRA, the world’s largest unlisted testing, inspection and certification (TIC) company, and LatticeFlow, the leading AI development and deployment platform for continuously improving model performance, today announced a strategic partnership to provide AI safety assessments for enterprise clients. The combined offering will deliver the industry’s first AI model assessment service, following the latest ISO standards for data quality and model robustness.

The new AI safety assessment service combines DEKRA’s expertise in conducting independent, third-party safety and security assessments with LatticeFlow’s award-winning, scalable platform for evaluating and certifying a client’s AI data and models. The service allows clients to obtain a comprehensive and unbiased evaluation of their AI data and models, and to manage risk by identifying and eliminating gaps in data quality and model performance.


“In the past six months, the rate of AI adoption has significantly accelerated. This has raised safety concerns among prominent AI researchers, who called for a temporary halt to large-scale AI experiments – all with the hope of ensuring more secure and responsible implementations,” said LatticeFlow’s CEO, Petar Tsankov. “Although a complete pause is likely to be infeasible, we believe that by working with DEKRA’s proven team, we will be able to help companies mitigate the risks associated with AI – and hasten the move towards safe AI adoption.”

AI Safety Assessments are the bridge to establishing trustworthy AI


Given this pressing need to help ensure progress in the accuracy, safety, transparency and reliability of AI systems, Fernando Hardasmal, DEKRA Executive Vice President, and Head of Digital & Product Solutions, and Petar Tsankov, CEO of LatticeFlow, signed the AI safety assessment partnership agreement on April 11th in Málaga, Spain.

The collaboration between DEKRA and LatticeFlow combines the two companies’ unique expertise and technology. DEKRA, with nearly a century of experience in safety and security testing and certification for top-tier enterprises, employs a proven methodology grounded in up-to-date standards for thorough assessments. LatticeFlow, founded by a group of AI researchers and experts, has developed the first scalable platform for evaluating AI data and models, with which the team won the world’s first global competition for AI testing. Both teams are actively involved in defining new standards for trustworthy AI through organizations including the International Organization for Standardization (ISO) and the European Commission.


Xavier Valero, Director of Artificial Intelligence and Advanced Analytics at DEKRA, added: “The need to start ensuring the security, safety and reliability of AI applications is time-critical, even though AI regulations are still being developed. Trustworthy AI must be the gatekeeper for digital transformation, and DEKRA supports clients in their effort to guarantee that their AI technology is reliable, secure, and safe through our testing and advisory services.”


