Artificial Intelligence | News | Insights | AiThority

Cerebras Systems Accelerates Global Growth with New India Office

Led by Former Intel Leader, Lakshmi Ramachandran, Cerebras Expands R&D Operations and Strengthens Local Customer Growth

Cerebras Systems, the pioneer in accelerating artificial intelligence (AI) compute, announced its continued global expansion with the opening of a new office in Bangalore, India. Led by industry veteran Lakshmi Ramachandran, the new engineering office will focus on accelerating R&D efforts and supporting local customers. With more than twenty engineers currently employed and a target of more than sixty by year end, Cerebras is looking to rapidly build its presence in Bangalore.


“India in general, and Bangalore in particular, is extremely well-positioned to be a hotbed for AI innovation. It has world-leading universities, pioneering research institutions and a large domestic enterprise market,” said Andrew Feldman, CEO and Co-Founder of Cerebras Systems. “Cerebras is committed to being a leader in this market. Under Lakshmi’s leadership, we are rapidly hiring top-notch engineering talent for Cerebras Systems India, as well as supporting sophisticated regional customers who are looking to do the most challenging AI work more quickly and easily.”

As part of the India expansion, Cerebras appointed Ramachandran as Head of Engineering and India Site Lead at Cerebras India. Based in Bangalore, Ramachandran brings more than 24 years of technical and leadership experience in software and engineering. Prior to joining Cerebras, she held various engineering and leadership roles at Intel, most recently as Senior Director in Intel’s Data Center and AI group, where she was responsible for delivering key deep learning software capabilities for AI accelerators. She has extensive experience in scaling business operations and establishing technical engineering teams in India.

“I am honored to be part of Cerebras Systems’ mission to change the future of AI compute, and to work with the extraordinary team that made wafer scale compute a reality,” said Ramachandran. “We have already begun to build a world class team of top AI talent in India, and we are excited to be building core components here that are critical to the success of Cerebras’ mission. We look forward to adding more technology and engineering talent as we support the many customer opportunities we have countrywide.”



The Cerebras CS-2 is the fastest AI system in existence. It is powered by the largest processor ever built, the Cerebras Wafer-Scale Engine 2 (WSE-2), which is 56 times larger than its nearest competitor. As a result, the CS-2 delivers more AI-optimized compute cores, more fast memory, and more fabric bandwidth than any other deep learning processor. It was purpose-built to accelerate deep learning workloads, reducing time to answer by orders of magnitude.

With a CS-2, Cerebras recently set a world record for the largest AI model trained on a single device. This is significant because in natural language processing (NLP), larger models trained on large datasets have been shown to be more accurate. Traditionally, however, only the largest and most sophisticated technology companies had the resources and expertise to train massive models across hundreds or thousands of graphics processing units (GPUs). By making it possible to train GPT-3-scale models on a single CS-2, Cerebras is enabling the entire AI ecosystem to set up and train large models in a fraction of the time.

Customers around the world are already leveraging the Cerebras CS-2 to accelerate their AI research across drug discovery, clean energy exploration, cancer treatment research and more. For example, pharmaceutical leader GSK can now train complex epigenomic models on a previously prohibitively large dataset, made possible for the first time with Cerebras. AstraZeneca is iterating and experimenting in real time by running queries on hundreds of thousands of abstracts and research papers, a process that previously took over two weeks on a GPU cluster and is now completed in just over two days with Cerebras. And Argonne National Laboratory is using a CS-2 to study how the virus that causes COVID-19 works, running simulations on a single CS-2 that would otherwise require 110-120 GPUs.


