
Cirrascale Cloud Services and Cerebras Systems Announce Availability of Cerebras Cloud @ Cirrascale for Unparalleled Artificial Intelligence Performance

New Partnership Democratizes AI by Delivering the Highest-Performing AI Compute and Massively Scalable Deep Learning in an Accessible, Easy-to-Use, Affordable Cloud Solution

Cirrascale Cloud Services, a provider of deep learning infrastructure solutions for autonomous vehicle, Natural Language Processing (NLP), and computer vision workflows, announced the availability of its Cerebras Cloud @ Cirrascale platform. Delivering fast, flexible training and easy programming with standard ML frameworks and Cerebras’ software, Cerebras Cloud @ Cirrascale brings the industry-leading performance of Cerebras’ Wafer-Scale Engine (WSE-2) deep learning processor to more users through Cirrascale’s proven, scalable, and accessible cloud service.
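For context, training jobs on platforms like this are typically written against a standard framework such as PyTorch. The sketch below is a minimal, illustrative training loop using only stock PyTorch APIs; the model, data, and hyperparameters are placeholders, and it does not depict Cerebras’ own software stack or compiler flow.

```python
# Minimal sketch of a standard PyTorch training loop, illustrating the
# "standard ML frameworks" workflow described above. Model, data, and
# hyperparameters are placeholders; this is NOT Cerebras' own API.
import torch
import torch.nn as nn

# Placeholder model and synthetic data standing in for an NLP/vision workload.
model = nn.Sequential(nn.Linear(512, 1024), nn.ReLU(), nn.Linear(1024, 2))
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

inputs = torch.randn(32, 512)        # batch of 32 feature vectors
labels = torch.randint(0, 2, (32,))  # binary labels

for step in range(10):
    optimizer.zero_grad()
    logits = model(inputs)
    loss = loss_fn(logits, labels)
    loss.backward()
    optimizer.step()
```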


“Cirrascale is thrilled to bring Cerebras’ world-class artificial intelligence (AI) compute power to the cloud – we are truly democratizing AI by broadening deep learning access and enabling large-scale commercial deployments across leading Fortune 500 customers, research organizations and innovative startups,” said PJ Go, CEO, Cirrascale Cloud Services. “The Cerebras Cloud @ Cirrascale will advance the future of deep learning and greatly accelerate the performance of AI workloads, all through a single device that is easy to use and simple to deploy.”

Cerebras Cloud delivers the wall-clock training performance of tens to hundreds of GPUs for training and inference workloads on a single CS-2 system, providing cluster-scale performance with single-node programming simplicity through standard ML frameworks. In a recent comparison, a Cerebras system trained a BERT-type NLP model to accuracy 9.5x faster than an 8-GPU server. Because training time does not scale linearly with GPU count, users would need more than 120 GPUs to train the model to the same accuracy as a single CS-2. The WSE-2 powering Cerebras Cloud uniquely handles the fine-grained and dynamic sparsity that is common in neural network workloads but challenging for traditional processors.
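As a rough illustration of that claim, the arithmetic below starts from the article’s 9.5x figure against an 8-GPU server and applies an assumed scaling efficiency (a hypothetical value, not a measured one) to show why sub-linear GPU scaling pushes the equivalent GPU count past 120.

```python
# Back-of-the-envelope illustration of the GPU-count claim above. The 9.5x
# speedup over an 8-GPU server is from the article; the scaling-efficiency
# value below is a hypothetical assumption used only to show why sub-linear
# scaling pushes the equivalent GPU count past 120.

speedup_vs_8_gpus = 9.5
baseline_gpus = 8

# If GPU training scaled perfectly (linearly), matching the CS-2 would take:
linear_equivalent = baseline_gpus * speedup_vs_8_gpus  # 76 GPUs

# With sub-linear scaling (less throughput gained per added GPU), the
# required count grows. 0.6 here is an illustrative value, not measured.
assumed_scaling_efficiency = 0.6
sublinear_equivalent = linear_equivalent / assumed_scaling_efficiency  # ~127 GPUs

print(f"Linear scaling: ~{linear_equivalent:.0f} GPUs")
print(f"At {assumed_scaling_efficiency:.0%} efficiency: ~{sublinear_equivalent:.0f} GPUs")
```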



“We are excited to partner with Cirrascale and introduce the Cerebras Cloud @ Cirrascale, bringing the power and performance of our CS-2 system to more customers looking to glean world-changing insights from massive datasets sooner,” said Andrew Feldman, CEO and Co-Founder of Cerebras. “We designed the WSE-2 with 850,000 AI-optimized cores so that the CS-2 system at the foundation of the Cerebras Cloud @ Cirrascale offering can deliver cluster-scale acceleration of AI workloads easily with a single system, thereby enabling customers to deploy solutions faster, using large, state-of-the-art models for training or inference.”

The Cerebras Cloud @ Cirrascale is currently available in weekly or monthly flat-rate allotments, with discounts for longer-term use, predictable pricing, and no extra charges. Additionally, Cirrascale can integrate Cerebras Cloud with a customer’s existing cloud-based workflows at various hyperscalers, creating a secure, multi-cloud solution.

With every component optimized for AI work, the Cerebras Cloud @ Cirrascale delivers more compute performance in less space and at lower power than any other solution. Depending on the workload, from AI to HPC, it delivers hundreds or thousands of times more performance than legacy alternatives while using only a fraction of the space and power. Cerebras Cloud is designed for fast, flexible training and low-latency datacenter inference, thanks to greater compute density, faster memory, and higher-bandwidth interconnect than any other datacenter AI solution.


