
6th-Gen TPU Trillium Intensifies the Competition in AI Accelerator Market

The global market for AI accelerator chips has gathered momentum since ChatGPT’s launch in November 2022, and accelerator chip providers are now considered the backbone of some of today’s most disruptive industries. Organizations in healthcare, gaming, manufacturing, and software development are among the top users of AI accelerators, applying them to digital transformation initiatives built on the Internet of Things (IoT), robotic automation, natural language processing, and blockchain. According to recent research published by GlobalData, Google’s sixth-generation tensor processing unit (TPU), named Trillium, has intensified competition in the AI accelerator market.

What is an AI Accelerator?

There are many definitions of AI accelerators online. We have compiled the best of them here for your reference, along with their sources.

“An AI accelerator is a high-performance parallel computation machine that is specifically designed for the efficient processing of AI workloads like neural networks.” – by Synopsys

“An AI accelerator refers to specialized hardware designed to efficiently process AI and machine learning algorithms, particularly those involving deep learning, neural networks, and machine vision.” – [Source]

IBM defines the AI accelerator through its enterprise product, Watson Machine Learning Accelerator.

According to IBM, Watson Machine Learning Accelerator is enterprise AI infrastructure that makes deep learning and machine learning more accessible to users, combining popular open-source deep learning frameworks with efficient AI development tools. The emphasis on open source is growing, which is why IBM has doubled down on its AI strategy for the open-source community. At the Think 2024 keynote, IBM SVP and Director of Research Darío Gil urged AI users to create value with these game-changing tools by extracting insights from their data. AI accelerators, therefore, can boost performance and scale AI development across multiple applications and business use cases.

In short, an AI accelerator is a specialized microprocessor designed to perform high-speed computation for AI tasks.
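
To make that definition concrete, here is a minimal JAX sketch (illustrative shapes, not tied to any particular chip) of the dense, highly parallel linear algebra that these processors are built to execute. The same code runs on a TPU or GPU if one is present and falls back to the CPU otherwise.

```python
# Minimal sketch of the kind of dense, parallel linear algebra AI accelerators
# are built for. Shapes are illustrative; JAX picks the best available backend.
import jax
import jax.numpy as jnp

@jax.jit  # XLA-compiles the function for the detected backend (tpu/gpu/cpu)
def dense_layer(x, w, b):
    # One fully connected layer: matrix multiply, bias add, ReLU.
    return jax.nn.relu(x @ w + b)

kx, kw = jax.random.split(jax.random.PRNGKey(0))
x = jax.random.normal(kx, (1024, 512))   # a batch of activations
w = jax.random.normal(kw, (512, 256))    # a weight matrix
b = jnp.zeros((256,))

print("backend:", jax.default_backend())  # e.g. "tpu", "gpu", or "cpu"
print(dense_layer(x, w, b).shape)         # (1024, 256)
```

The workload reduces to large matrix multiplications, which is exactly the operation that accelerator hardware parallelizes far more aggressively than a general-purpose CPU.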

What is the Trillium AI Accelerator?


Trillium is Google Cloud’s sixth-generation TPU, regarded as its best-performing and most energy-efficient TPU yet for advanced machine learning models. What makes Trillium so powerful?

Trillium’s power can be gauged from its role in Google Cloud’s AI Hypercomputer, a supercomputing architecture tailor-made to run large ML models on cutting-edge AI accelerators across different platforms, including Vertex AI Training, Google Kubernetes Engine (GKE), and Google Compute Engine.
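
This is not drawn from Google’s Trillium documentation, but as a rough illustration of what running a workload on Cloud TPUs looks like from the software side, the hedged JAX sketch below lists the accelerator devices a TPU VM exposes and replicates a toy computation across them. Device counts and array shapes are purely illustrative.

```python
# Hypothetical sketch: inspect the accelerators JAX sees on a Cloud TPU VM
# (for example one provisioned through GKE or Compute Engine) and run a
# replicated toy computation across them. Falls back gracefully on CPU.
import jax
import jax.numpy as jnp

devices = jax.devices()   # e.g. a list of TpuDevice objects on a TPU VM
print(f"{len(devices)} device(s) on platform '{devices[0].platform}'")

@jax.pmap                 # replicate the function across all local devices
def scaled_sum(x):
    return jnp.sum(x) * 2.0

n = jax.local_device_count()
batch = jnp.ones((n, 128))   # one shard of data per local device
print(scaled_sum(batch))     # one partial result per device
```

Higher-level services such as Vertex AI Training and GKE handle the provisioning; once the code lands on a TPU slice, the framework-level picture is essentially the one above.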

At its core, the Trillium TPU includes a third-generation SparseCore, a specialized accelerator for embedding-heavy workloads that lets Trillium serve advanced machine learning models with reduced latency, lower energy use, and lower cost. Trillium is 67% more energy-efficient than its predecessor, TPU v5e, and delivers significantly higher compute performance: almost 5x more per chip than the older model. That jump in per-chip performance speeds up the embedding-heavy AI/ML workloads behind models such as Gemini 1.5 Flash, Imagen 3, and Gemma 2.
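
GlobalData’s report does not detail SparseCore’s programming model, so the JAX sketch below (with made-up table sizes and batch shapes) is only meant to show what an “embedding-heavy” workload looks like: large, sparse lookups into an embedding table followed by a pooling step, the pattern that a dedicated sparse unit is designed to offload.

```python
# Illustrative sketch of an "embedding-heavy" workload: gather rows from a
# large embedding table and pool them, as in recommendation/ranking models.
# Table size, embedding dimension, and batch shape are made up.
import jax
import jax.numpy as jnp

VOCAB, DIM = 100_000, 64
key = jax.random.PRNGKey(0)
table = jax.random.normal(key, (VOCAB, DIM))        # the embedding table

@jax.jit
def embed_and_pool(ids):
    vectors = jnp.take(table, ids, axis=0)          # (batch, ids_per_row, DIM)
    return vectors.mean(axis=1)                     # mean-pool to (batch, DIM)

ids = jax.random.randint(key, (32, 20), 0, VOCAB)   # 32 examples, 20 ids each
print(embed_and_pool(ids).shape)                    # (32, 64)
```

Dense matrix units handle this scattered memory-access pattern poorly, which is why offloading it to a dedicated unit can cut both latency and energy use.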

According to GlobalData, Trillium presents an exciting opportunity for the AI accelerator market. While Google is ahead of its competitors, there is a push coming from AI startups; the report notes that robust growth is likely once startups begin to challenge the incumbent players. Seventeen new companies are entering the market with new patents and applications, most of them focused on edge devices, followed by autonomous driving and other segments.

Google has also paired Trillium TPUs with Essential AI, an enterprise AI startup, to expand the scope of integrating computer-driven intelligence with a human interface. Similarly, TPU-based AI acceleration is transforming patient care experiences, generative AI outcomes, and robotic process automation workflows.

Keeping an Eye on the AI Accelerator Market

The report names NVIDIA, Intel, Cortica, Meta Platforms, Micron, EdgeCortix, and others in its leaders quadrant. Western Digital, Wave Computing, Hailo, and others are challengers, while Toshiba, Qualcomm, Fujitsu, HP, IBM, Baidu, and Tesla are classed as “laggards.”

Microsoft, AMD, Groq, Huawei, and others are named as “explorers” in the GlobalData report.


GlobalData found that 20% of the patents in AI accelerator chip technology were filed by startups. Most of these accelerator startups are trying to carve out their own markets and move away from dependence on NVIDIA, Intel, and AMD. In the coming months, we could see substantial investments and acquisitions in this domain.
