
Lightning AI Releases PyTorch Lightning 2.0 and a New Open Source Library for Lightweight Scaling of Machine Learning Models

Lightning AI, the company accelerating the development of an AI-powered world, announced the general availability of PyTorch Lightning 2.0, the company’s flagship open source AI framework used by more than 10,000 organizations to quickly and cost-efficiently train and scale machine learning models. The new release introduces a stable API, offers a host of powerful features with a smaller footprint, and is easier to read and debug. Lightning AI has also unveiled Lightning Fabric to give users full control over their training loop. This new library allows users to leverage tools like callbacks and checkpoints only when needed, and also supports reinforcement learning, active learning and transformers without losing control over training code.
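For readers unfamiliar with the framework, the snippet below is a minimal, illustrative sketch of the out-of-the-box workflow PyTorch Lightning provides: you define a LightningModule and hand it to a Trainer, which runs the training loop and manages logging, checkpointing, and device placement. The toy model, data, and hyperparameters are assumptions made purely for illustration and are not part of the announcement.

```python
# Hypothetical minimal example of the "works out of the box" path in
# PyTorch Lightning 2.0. The toy model and random data are illustrative only.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset
import lightning as L  # unified package name in PyTorch Lightning 2.0


class LitClassifier(L.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = F.cross_entropy(self.layer(x), y)
        self.log("train_loss", loss)  # built-in logging
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)


# Random toy data stands in for a real dataset.
dataset = TensorDataset(torch.randn(256, 32), torch.randint(0, 2, (256,)))
trainer = L.Trainer(max_epochs=3, accelerator="auto")
trainer.fit(LitClassifier(), DataLoader(dataset, batch_size=32))
```

Scaling a script like this across GPUs is typically a matter of changing Trainer arguments such as devices and strategy rather than modifying the training code itself.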


The new releases are available for download. Users seeking a simple, scalable training method that works out of the box can use PyTorch Lightning 2.0, while those who need additional granularity and control throughout the training process can use Lightning Fabric. By extending its portfolio of open source offerings, Lightning AI is supporting a wider range of individual and enterprise developers as advances in machine learning accelerate.

Until now, machine learning practitioners have had to choose between two extremes: using prescriptive tools to train and deploy machine learning models, or figuring everything out entirely on their own. With the update to PyTorch Lightning and the introduction of Lightning Fabric, Lightning AI now offers users an extensive array of training options for their machine learning models. If users want features like checkpointing, logging, and early stopping included out of the box, the original PyTorch Lightning framework is the right choice. If users want to scale a model to multiple GPUs while writing their own training loop for tasks like reinforcement learning, a method used by OpenAI's ChatGPT, then Fabric is the right choice.
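As a rough sketch of that second path, under the same toy assumptions as above, Fabric takes over device placement and (when configured) multi-GPU strategy, while the loop itself remains ordinary PyTorch that the user is free to restructure, for example around a reinforcement learning objective.

```python
# Hypothetical sketch of a Fabric-based loop: Fabric handles accelerator setup,
# while the training loop stays plain PyTorch under the user's control.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset
from lightning.fabric import Fabric

fabric = Fabric(accelerator="auto", devices=1)  # e.g. devices=4 to scale out
fabric.launch()

model = nn.Linear(32, 2)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
model, optimizer = fabric.setup(model, optimizer)

# Random toy data stands in for a real dataset.
dataset = TensorDataset(torch.randn(256, 32), torch.randint(0, 2, (256,)))
dataloader = fabric.setup_dataloaders(DataLoader(dataset, batch_size=32))

for epoch in range(3):
    for x, y in dataloader:
        optimizer.zero_grad()
        loss = F.cross_entropy(model(x), y)
        fabric.backward(loss)  # replaces loss.backward() for distributed/mixed-precision runs
        optimizer.step()
```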


“The dual launch of PyTorch Lightning 2.0 and Lightning Fabric marks a significant milestone for Lightning AI and our community,” said William Falcon, creator of PyTorch Lightning and the CEO and co-founder of Lightning AI. “Each release provides a robust set of options for companies with in-house machine learning developers interested in additional granularity and those who want a simple scaling solution that works out of the box. As the size of models deployed in enterprise settings continues to expand, the ability to scale the training process in a way that is simple, lightweight, and cost-effective becomes increasingly important. Lightning AI is proud to be expanding the choices that users have to scale their models for downstream tasks whether those are internal, like analytics and content creation tools, or customer-facing, like ChatGPT and other customer service chatbots.”


