
NVIDIA Will Design New AI Chips Every Year

What Is the News About?

Nvidia introduced the Ampere chip in 2020, the Hopper architecture in 2022, and Blackwell, its latest offering, early this year. In a report published earlier this month, analyst Ming-Chi Kuo predicted that the new Rubin chip would debut in 2025 and that the architecture would power the R100 GPUs. The H100 GPU has been the industry’s darling for training AI models, and now even smaller tech companies are vying for it.

Huang added that the company also intends to speed up production of its other chips. “We will move forward with all of them at a tremendous pace,” he said, announcing that a wave of new processors is on the way, including central processing units, graphics processing units, network interface cards, switches, and more. Because the new AI chips are electrically and mechanically backward compatible and share the same software, Huang said firms will have little trouble switching from, for example, Hopper to Blackwell.


Why Is It Important?

The fiercely contested AI race has pushed Big Tech companies to rush to buy Nvidia’s in-demand GPUs. In January this year, Meta CEO Mark Zuckerberg announced the acquisition of around 350,000 H100 GPUs for computing. Self-driving companies are also building up computing power: according to Nvidia’s CEO, Tesla bought 35,000 GPUs to train its autonomous driving system.

Huang also explained the reason for the huge demand. The next company to reach the next significant plateau gets to announce a revolutionary AI, he said, while the one that follows can only announce an improvement of around 0.3 percent. The question he posed: does a company want to be the one delivering revolutionary AI, or the one that is only slightly better?



Nvidia faces competition from tech companies that have begun developing their own artificial intelligence chips in response to rising computing costs and demand. Huang’s decision to move quickly is driven by the need to keep Nvidia’s lead in AI computing.


Benefits

1. Increased Technological Innovation and Performance: Nvidia’s continuous development and release of advanced chips such as Ampere, Hopper, Blackwell, and the anticipated Rubin exemplifies its commitment to pushing the boundaries of technological innovation. Each new generation of GPUs and processors brings significant improvements in performance, energy efficiency, and capabilities, particularly in AI and machine learning applications. The H100 GPU, for example, has become a critical asset for training AI models, and with new iterations such as Blackwell, smaller tech companies can also access high-performance computing power, fostering broader advances in AI research and development.

2. Seamless Transition and Compatibility for Businesses: Nvidia’s strategy of ensuring electrical and mechanical backward compatibility, along with shared software across its new AI chips, simplifies the transition for businesses upgrading their hardware. This backward compatibility means that companies can switch from Hopper GPUs to Blackwell GPUs without significant integration issues. The ease of upgrade not only reduces downtime and transition costs but also encourages businesses to adopt newer technologies more quickly, helping them maintain competitive advantages through continual improvement of their AI and computing capabilities, as the brief code sketch below illustrates.
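To make the “shared software” point concrete, here is a minimal, hypothetical sketch in Python using PyTorch (the framework choice is ours for illustration; the article refers to Nvidia’s shared software layer only in general terms). Because the framework targets the common CUDA programming model rather than a specific chip, the same training step runs unchanged on a Hopper-class H100 or a Blackwell-class GPU, assuming the installed driver and framework build support the newer architecture.

```python
# Minimal sketch: the same training step runs on Hopper- or Blackwell-class
# GPUs without code changes, because PyTorch targets the shared CUDA stack
# rather than a specific chip generation. (Illustrative only; not Nvidia's
# own example code.)
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
if device == "cuda":
    props = torch.cuda.get_device_properties(0)
    # Reports e.g. an H100 on Hopper; a Blackwell part would simply report
    # its own name and compute capability here.
    print(f"Training on {props.name} (compute capability {props.major}.{props.minor})")

model = torch.nn.Sequential(
    torch.nn.Linear(1024, 4096),
    torch.nn.GELU(),
    torch.nn.Linear(4096, 10),
).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

x = torch.randn(64, 1024, device=device)
y = torch.randint(0, 10, (64,), device=device)

# Mixed-precision autocast is architecture-agnostic; newer GPU generations
# simply execute the same code faster.
with torch.autocast(device_type=device, dtype=torch.bfloat16, enabled=(device == "cuda")):
    loss = torch.nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
print(f"loss: {loss.item():.4f}")
```

In practice, moving a workload like this from Hopper to Blackwell would mostly be a matter of updated drivers and framework builds rather than application changes, which is presumably the low switching cost Huang is describing.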

