Normal Computing Unveils the First-ever Thermodynamic Computer

Deep tech AI startup’s breakthrough in physics-based computing portends future energy-efficient, AGI-capable processors

Normal Computing, a deep tech AI startup founded by former Google Brain and Alphabet X engineers to develop full-stack AI applications with enterprise reliability, today unveiled the world’s first thermodynamic computer. Normal’s team used the prototype hardware to conduct the first-ever thermodynamic AI experiment, adding reliability and controls to the outputs of a neural network, work that could one day help eliminate hallucinations in AI models and enable AI agents that reason about the world while remaining controllable and safe.

AI applications, such as those powered by generative AI and large language models, require massive computing resources, and today’s computers may not be powerful enough to unlock their full scope. The energy these systems demand will only become a bigger problem as AI models grow in size, with power consumption already a major issue for today’s Graphics Processing Units (GPUs).

Furthermore, even cutting-edge generative AI solutions can be unreliable, which makes them unusable in mission-critical applications. Properly accounting for uncertainty through probabilistic AI methods may be essential for AI agents to plan, reason, and exercise common sense. But probabilistic approaches add even more scaling complexity, particularly on GPUs, while potentially running thousands of times more efficiently on a thermodynamic computer, as indicated in Normal’s latest research paper.
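For readers unfamiliar with the idea, probabilistic AI methods return a distribution over outputs rather than a single point prediction, so a system can tell when it is guessing. The sketch below is a minimal, hypothetical illustration of that principle in plain Python/NumPy, not Normal Computing’s method or hardware: the model, weights, and decision threshold are invented for demonstration.

```python
import numpy as np

# Toy illustration of probabilistic (Bayesian-style) prediction:
# instead of a single weight vector, sample many plausible weight vectors
# and examine the spread of the resulting predictions.
# All numbers here are made up for demonstration purposes.

rng = np.random.default_rng(0)

# Hypothetical posterior over the weights of a tiny linear model:
# mean weights learned from data, with some remaining uncertainty.
weight_mean = np.array([0.8, -0.3])
weight_std = np.array([0.05, 0.40])   # the second weight is poorly constrained

def predict_with_uncertainty(x, n_samples=1000):
    """Return the predictive mean and std for input x by sampling weights."""
    weights = rng.normal(weight_mean, weight_std, size=(n_samples, 2))
    preds = weights @ x                # one prediction per sampled weight vector
    return preds.mean(), preds.std()

for x in [np.array([1.0, 0.1]), np.array([0.2, 2.0])]:
    mean, std = predict_with_uncertainty(x)
    # A probabilistic agent can act on confident predictions and
    # defer (or ask for help) when the spread is too large.
    verdict = "use prediction" if std < 0.2 else "uncertain, defer to review"
    print(f"input={x}, mean={mean:.2f}, std={std:.2f} -> {verdict}")
```

In this picture, the cost of each prediction scales with the number of samples drawn, which is part of why probabilistic methods are expensive on GPUs and why Normal argues that a thermodynamic computer, which carries out the sampling physically, could be far more efficient.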

“Much of the mainstream discussion around AI this year was dominated by ChatGPT and consumer curiosities, but the race is on to deliver meaningful impact and solve mission-critical enterprise problems,” said Faris Sbahi, CEO and Co-Founder at Normal Computing. “Our experiment is one important step as part of our overall bet to scale AI agents that can natively reason, and are reliable and capable enough to be trusted as partners within important institutions like in semiconductors and financial services.”

Quantum computing has promised to deliver commercially relevant speedups for machine learning, linear algebra, and generative AI, but the impact of quantum computers is still several years away. The thermodynamic paradigm is nearer term, relying on standard semiconductor fabrication and operating at room temperature.

“The promise of Quantum has not lived up to its hype. Quantum computers are largely still in the academic stage, and haven’t yet delivered commercial value. Advanced compute, not talent, may be the limiting resource in AI’s benefits becoming ubiquitous. Our goal is to unlock a step change in AI capabilities and efficiency through our novel thermodynamic stack,” said Patrick Coles, Chief Scientist at Normal Computing and former head of Quantum Computing at Los Alamos National Laboratory.
