Google Cloud and NVIDIA Expand Partnership to Advance AI Computing, Software and Services

NVIDIA Generative AI Technology Used by Google DeepMind and Google Research Teams Now Optimized and Available to Google Cloud Customers Worldwide

Google Cloud and NVIDIA announced new AI infrastructure and software for customers to build and deploy massive models for generative AI and speed data science workloads.

In a fireside chat at Google Cloud Next, Google Cloud CEO Thomas Kurian and NVIDIA founder and CEO Jensen Huang discussed how the partnership is bringing end-to-end machine learning services to some of the largest AI customers in the world — including by making it easy to run AI supercomputers with Google Cloud offerings built on NVIDIA technologies. The new hardware and software integrations utilize the same NVIDIA technologies employed over the past two years by Google DeepMind and Google research teams.

“We’re at an inflection point where accelerated computing and generative AI have come together to speed innovation at an unprecedented pace,” Huang said. “Our expanded collaboration with Google Cloud will help developers accelerate their work with infrastructure, software and services that supercharge energy efficiency and reduce costs.”

“Google Cloud has a long history of innovating in AI to foster and speed innovation for our customers,” Kurian said. “Many of Google’s products are built and served on NVIDIA GPUs, and many of our customers are seeking out NVIDIA accelerated computing to power efficient development of LLMs to advance generative AI.”

NVIDIA Integrations to Speed AI and Data Science Development

Google’s framework for building massive large language models (LLMs), PaxML, is now optimized for NVIDIA accelerated computing.

Originally built to span multiple Google TPU accelerator slices, PaxML now enables developers to use NVIDIA® H100 and A100 Tensor Core GPUs for advanced and fully configurable experimentation and scale. A GPU-optimized PaxML container is available immediately in the NVIDIA NGC™ software catalog. In addition, PaxML runs on JAX, which has been optimized for GPUs leveraging the OpenXLA compiler.
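
To make the JAX and OpenXLA point concrete, the sketch below shows a minimal JAX program picking up NVIDIA GPUs through the XLA compiler path that PaxML builds on. It is illustrative only, not PaxML code and not taken from the announcement, and it assumes a CUDA build of JAX is installed on a machine with H100 or A100 GPUs.

```python
# Minimal illustrative sketch (not PaxML): a jitted JAX function is
# traced once and compiled by XLA/OpenXLA into a GPU kernel.
import jax
import jax.numpy as jnp

# On a GPU machine with a CUDA build of JAX this lists the NVIDIA devices.
print(jax.devices())

@jax.jit  # jax.jit hands the traced computation to the XLA compiler
def scaled_matmul(a, b):
    return jnp.dot(a, b) * 0.5

key_a, key_b = jax.random.split(jax.random.PRNGKey(0))
a = jax.random.normal(key_a, (1024, 1024), dtype=jnp.bfloat16)
b = jax.random.normal(key_b, (1024, 1024), dtype=jnp.bfloat16)

out = scaled_matmul(a, b)  # first call compiles; later calls reuse the kernel
print(out.shape, out.dtype)
```

PaxML layers its model and training configuration on top of this same JAX/XLA path, which is what the GPU optimizations in the NGC container target.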

Google DeepMind and other Google researchers are among the first to use PaxML with NVIDIA GPUs for exploratory research.

The NVIDIA-optimized container for PaxML is available immediately on the NVIDIA NGC container registry to researchers, startups and enterprises worldwide that are building the next generation of AI-powered applications.

Additionally, the companies announced Google’s integration of serverless Spark with NVIDIA GPUs through Google’s Dataproc service. This will help data scientists speed Apache Spark workloads to prepare data for AI development.
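
As a rough illustration of the kind of job this targets, the sketch below is a plain PySpark data-preparation step; the bucket paths and column names are hypothetical, and it is not code from Google or NVIDIA. The idea is that the same DataFrame code can be submitted as a GPU-accelerated serverless Spark batch on Dataproc rather than rewritten.

```python
# Hypothetical PySpark data-prep sketch; paths and columns are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("ai-data-prep").getOrCreate()

# Read raw events from a (hypothetical) Cloud Storage bucket.
events = spark.read.parquet("gs://example-bucket/raw/events/")

# Filter labeled rows and aggregate simple per-user features.
features = (
    events
    .where(F.col("label").isNotNull())
    .groupBy("user_id")
    .agg(
        F.count("*").alias("event_count"),
        F.avg("session_seconds").alias("avg_session_seconds"),
    )
)

# Write features out for a downstream training job.
features.write.mode("overwrite").parquet("gs://example-bucket/features/")
spark.stop()
```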

These new integrations are the latest in NVIDIA and Google’s extensive history of collaboration. The announcements span hardware and software, including:

  • Google Cloud on A3 virtual machines powered by NVIDIA H100 — Google Cloud announced today its purpose-built Google Cloud A3 VMs powered by NVIDIA H100 GPUs will be generally available next month, making NVIDIA’s AI platform more accessible for a broad set of workloads. Compared to the previous generation, A3 VMs offer 3x faster training and significantly improved networking bandwidth.
  • NVIDIA H100 GPUs to power Google Cloud’s Vertex AI platform — H100 GPUs are expected to be generally available on Vertex AI in the coming weeks, enabling customers to quickly develop generative AI LLMs.
  • Google Cloud to gain access to NVIDIA DGX™ GH200 — Google Cloud will be one of the first companies in the world to have access to the NVIDIA DGX GH200 AI supercomputer — powered by the NVIDIA Grace Hopper™ Superchip — to explore its capabilities for generative AI workloads.
  • NVIDIA DGX Cloud Coming to Google Cloud — NVIDIA DGX Cloud AI supercomputing and software will be available to customers directly from their web browser to provide speed and scale for advanced training workloads.
  • NVIDIA AI Enterprise on Google Cloud Marketplace — Users can access NVIDIA AI Enterprise, a secure, cloud native software platform that simplifies developing and deploying enterprise-ready applications including generative AI, speech AI, computer vision, and more.
  • Google Cloud first to offer NVIDIA L4 GPUs — Earlier this year, Google Cloud became the first cloud provider to offer NVIDIA L4 Tensor Core GPUs with the launch of the G2 VM. Customers switching from CPUs to L4 GPUs for AI video workloads can realize up to 120x higher performance with 99% better efficiency. L4 GPUs are used widely for image and text generation, as well as VDI and AI-accelerated audio/video transcoding (a brief GPU-visibility check for these VMs follows this list).
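
For the A3 (H100) and G2 (L4) virtual machines listed above, the short sketch below is one way to confirm the GPUs are visible inside the guest before launching a workload. It simply shells out to nvidia-smi, which ships with the NVIDIA driver; it is an illustrative convenience, not part of the announcement.

```python
# Hypothetical GPU-visibility check for an A3 (H100) or G2 (L4) VM.
import subprocess

def list_gpus():
    """Return one line per visible NVIDIA GPU (name and total memory)."""
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv,noheader"],
        capture_output=True,
        text=True,
        check=True,
    )
    return [line.strip() for line in result.stdout.splitlines() if line.strip()]

if __name__ == "__main__":
    for gpu in list_gpus():
        print(gpu)  # e.g. "NVIDIA H100 80GB HBM3, 81559 MiB"
```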

Google Cloud accelerates every organization’s ability to digitally transform its business and industry. We deliver enterprise-grade solutions that leverage Google’s cutting-edge technology, and tools that help developers build more sustainably. Customers in more than 200 countries and territories turn to Google Cloud as their trusted partner to enable growth and solve their most critical business problems.

Since its founding in 1993, NVIDIA has been a pioneer in accelerated computing. The company’s invention of the GPU in 1999 sparked the growth of the PC gaming market, redefined computer graphics and ignited the era of modern AI. NVIDIA is now a full-stack computing company with data-center-scale offerings that are reshaping industry.
