
Ampere To Acquire OnSpecta To Accelerate AI Inference On Cloud-Native Applications

Boosts AI Performance of Ampere® Altra® Family Across Cloud and Edge Infrastructure

Ampere® Computing announced it has agreed to acquire AI technology startup OnSpecta, strengthening Ampere® Altra® performance on AI inference applications. The OnSpecta Deep Learning Software (DLS) AI optimization engine can deliver significant performance gains over commonly used CPU-based machine learning (ML) frameworks. The companies have already been collaborating and have demonstrated more than 4x acceleration on Ampere-based instances running popular AI inference workloads. The acquisition will include an optimized model zoo with object detection, video processing and recommendation engines. Terms were not disclosed; the acquisition is expected to close in August, subject to customary closing conditions.


“We are excited to welcome the talented OnSpecta team to Ampere,” said Renee James, founder, chairman and CEO of Ampere Computing. “The addition of deep learning expertise will enable Ampere to deliver a more robust platform for inference task processing with lower power, higher performance and better predictability than ever. This acquisition underscores our commitment to delivering a truly differentiated cloud native computing platform for our customers in both cloud and edge deployments.”


According to IDC Research, the AI server market is expected to exceed $26 billion by 2024, with an annual growth rate of 13.7%. Ampere customers are seeking ways to manage the cost and growing compute requirements of AI inference tasks in both centralized and edge-based infrastructure scenarios. DLS is a seamless binary drop-in library for many popular AI frameworks and will significantly accelerate inference on Ampere Altra. It enables the use of the Altra-native FP16 data format, which can double performance over FP32 without significant accuracy loss or model retraining.
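For a concrete sense of the FP32-to-FP16 trade-off described above, the short NumPy sketch below is an illustration only (it does not use OnSpecta DLS, whose engine is proprietary): half-precision storage cuts a weight tensor's memory footprint in half while introducing only a small rounding error.

```python
import numpy as np

# Hypothetical 1024x1024 weight matrix, stored in FP32 and FP16.
weights_fp32 = np.random.rand(1024, 1024).astype(np.float32)
weights_fp16 = weights_fp32.astype(np.float16)

# FP16 halves the memory (and bandwidth) needed per tensor.
print(weights_fp32.nbytes)  # 4194304 bytes
print(weights_fp16.nbytes)  # 2097152 bytes

# The round-trip error for values in [0, 1) stays below ~5e-4,
# which is why accuracy loss is typically small for inference.
max_err = np.max(np.abs(weights_fp32 - weights_fp16.astype(np.float32)))
print(max_err)
```

Bandwidth-bound inference kernels often speed up roughly in proportion to that reduction in data moved, which is consistent with the "double performance" figure cited for the Altra-native FP16 path.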


“This is a natural progression of the strong collaboration we already have with Ampere,” said Indra Mohan, co-founder and CEO of OnSpecta. “Our team will greatly benefit from being a part of Ampere as we help build upon the great success of Ampere Altra and provide critical support to customers as they apply the Altra product family to a wide variety of AI inference use cases.”

“We have already seen the powerful performance and ease-of-use of Ampere Altra and OnSpecta on the Oracle OCI Ampere A1 instance,” said Clay Magouyrk, executive vice president of Oracle Cloud Infrastructure. “With DLS compatibility on all major open source AI frameworks including TensorFlow, PyTorch and ONNX, and the predictable performance of Ampere Altra, we expect to see continued innovation on OCI Ampere A1 shapes for AI inference workloads.”
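As a rough illustration of an FP16 inference path on an open-source stack (a sketch only, using the public onnxconverter-common and ONNX Runtime packages rather than DLS, whose interfaces are not public), the snippet below exports a small PyTorch model to ONNX, converts it to FP16, and runs it on the CPU execution provider:

```python
import numpy as np
import onnx
import onnxruntime as ort
import torch
from onnxconverter_common import float16

# Export a tiny FP32 model to ONNX as a stand-in for a real inference model.
model = torch.nn.Sequential(torch.nn.Linear(224, 64), torch.nn.ReLU(),
                            torch.nn.Linear(64, 10)).eval()
dummy = torch.randn(1, 224)
torch.onnx.export(model, dummy, "model_fp32.onnx",
                  input_names=["input"], output_names=["output"])

# Convert the graph's FP32 initializers and tensors to FP16.
m_fp16 = float16.convert_float_to_float16(onnx.load("model_fp32.onnx"))
onnx.save(m_fp16, "model_fp16.onnx")

# Run the converted model with ONNX Runtime on the CPU execution provider.
sess = ort.InferenceSession("model_fp16.onnx",
                            providers=["CPUExecutionProvider"])
out = sess.run(None, {"input": dummy.numpy().astype(np.float16)})
print(out[0].dtype, out[0].shape)  # float16, (1, 10)
```

Per the companies' description of DLS as a binary drop-in library, the acceleration layer would sit beneath frameworks like these rather than requiring application code changes.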

