
WekaIO Introduces Weka AI to Enable Accelerated Edge to Core to Cloud Data Pipelines

WekaIO (Weka), the innovation leader in high-performance and scalable file storage and an NVIDIA Partner Network Solution Advisor, has introduced Weka AI, a transformative storage solution framework underpinned by the Weka File System (WekaFS) that enables accelerated edge-to-core-to-cloud data pipelines. Weka AI is a framework of customizable reference architectures (RAs) and software development kits (SDKs) developed with leading technology alliance partners such as NVIDIA, Mellanox, and others in the Weka Innovation Network (WIN)™. It enables chief data officers, data scientists, and data engineers to accelerate deep learning (DL) pipelines in genomics, medical imaging, financial services (FSI), and advanced driver-assistance systems (ADAS). In addition, Weka AI scales easily from entry-level to large integrated solutions delivered through VARs and channel partners.

Artificial Intelligence (AI) data pipelines are inherently different from traditional file-based IO applications. Each stage of an AI data pipeline has distinct storage IO requirements: massive bandwidth for ingest and training; mixed read/write handling for extract, transform, load (ETL); ultra-low latency for inference; and a single namespace for visibility across the entire pipeline. Furthermore, AI at the edge is driving the need for edge-to-core-to-cloud data pipelines, so the ideal solution must meet all of these varied requirements and deliver timely insights at scale. Traditional solutions lack these capabilities, often falling short on performance, shareability across personas, and data mobility. Today's industries demand solutions that overcome these limitations and provide data management that delivers operational agility, governance, and actionable intelligence by breaking down silos.
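These stage-by-stage requirements can be summarized in a short sketch. The sketch below is illustrative only: the stage names follow the pipeline described above, while the access-pattern and priority values are general assumptions about AI workloads, not WekaFS specifications.

```python
# Illustrative sketch of per-stage storage IO profiles in an AI data pipeline.
# The stages follow the article; the profile fields and values are assumptions
# for illustration, not WekaFS specifications.
from dataclasses import dataclass


@dataclass
class IOProfile:
    access_pattern: str  # dominant IO pattern for the stage
    priority: str        # what the storage layer must optimize for


PIPELINE = {
    "ingest":    IOProfile("large sequential writes", "massive bandwidth"),
    "etl":       IOProfile("mixed random read/write", "balanced read/write handling"),
    "training":  IOProfile("large parallel reads",    "massive bandwidth to GPUs"),
    "inference": IOProfile("small random reads",      "ultra-low latency"),
}

# A single namespace means every stage sees the same files, edge to core to cloud,
# so data does not have to be copied between stages.
for stage, profile in PIPELINE.items():
    print(f"{stage:9s} -> {profile.access_pattern:26s} | {profile.priority}")
```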


Weka AI is architected to deliver production-ready solutions and accelerate DataOps by solving the storage challenges common to IO-intensive workloads such as AI. It accelerates the AI data pipeline, delivering more than 73 GB/sec of bandwidth to a single GPU client. In addition, it delivers operational agility with versioning, explainability, and reproducibility, and it provides governance and compliance with in-line encryption and data protection. Engineered solutions with partners in the WIN program ensure that Weka AI provides data collection, workspace and deep neural network (DNN) training, simulation, inference, and lifecycle management for the entire data pipeline.
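As a rough, hypothetical illustration of how a single client's sequential read bandwidth from a shared file system mount could be sanity-checked, the sketch below streams every file under an assumed mount path and reports aggregate throughput. The path, block size, and file layout are assumptions for illustration; reaching figures in the tens of GB/sec additionally depends on client hardware, the network, and parallel or GPUDirect Storage data paths, which this simple single-threaded sketch does not model.

```python
# Hypothetical sketch: measure aggregate sequential-read throughput from a
# shared file system mount. The mount path, block size, and file layout are
# illustrative assumptions, not Weka defaults or recommendations.
import os
import time

MOUNT = "/mnt/weka/dataset"   # assumed mount point for illustration
BLOCK = 16 * 1024 * 1024      # 16 MiB reads


def read_all(root: str) -> int:
    """Stream every file under root and return total bytes read."""
    total = 0
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            with open(os.path.join(dirpath, name), "rb", buffering=0) as f:
                while chunk := f.read(BLOCK):
                    total += len(chunk)
    return total


if __name__ == "__main__":
    start = time.perf_counter()
    nbytes = read_all(MOUNT)
    elapsed = time.perf_counter() - start
    print(f"read {nbytes / 1e9:.1f} GB in {elapsed:.1f} s "
          f"({nbytes / 1e9 / elapsed:.1f} GB/s)")
```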

Kevin Tubbs, Senior Director, Technology and Business Development, Advanced Solutions Group, Penguin Computing
“Weka AI provides a solution to meet the requirements of modern AI applications. We are very excited to be working with Weka to accelerate next generation AI data pipelines.”

Amrinderpal Singh Oberai, Director, Data & AI, Groupware Technology
“The Groupware Data & AI team has tested and validated the Weka AI reference architecture in-house. The Weka AI framework provides us and other ISV technology partners with the flexibility and technology innovation to build custom solutions to solve industry challenges through AI.”

Paresh Kharya, Director of Product Management for Accelerated Computing, NVIDIA
“End-to-end application performance for AI requires feeding high-performance NVIDIA GPUs with a high-throughput data pipeline. Weka AI leverages GPUDirect Storage to provide a direct path between storage and GPUs, eliminating I/O bottlenecks for data-intensive AI applications.”

Gilad Shainer, Senior Vice President Marketing, Mellanox Technologies
“InfiniBand has become the de facto standard for high-performance and scalable AI infrastructures, delivering high data throughput and enabling In-Network Computing Acceleration Engines. Together with the Weka AI framework, GPUDirect and GPUDirect Storage over InfiniBand provide our mutual customers with a world-leading platform for AI applications.”


Liran Zvibel, CEO and Co-Founder, WekaIO
“GPUDirect Storage eliminates IO bottlenecks and dramatically reduces latency, delivering full bandwidth to data-hungry applications. By supporting GPUDirect Storage in its implementations, Weka AI continues to deliver on its promise of highest performance at any scale for the most data-intensive applications.”

Shailesh Manjrekar, Head of AI and Strategic Alliances, WekaIO
“We are very excited to launch Weka AI to help businesses embark on their digital transformation journey. Line-of-business users as well as IT leaders can now implement AI 2.0 and cognitive computing workflows that scale, accelerate, and derive actionable business insights, thus enabling Accelerated DataOps.”

Gartner, Accelerate Your Machine Learning and Artificial Intelligence Journey Using These DevOps Best Practices, Nov. 12, 2019, Arun Chandrasekaran and Farhan Choudhary
“The DevOps culture transformed the agility in code development, and a similar culture is required on the data side as well. Therefore, creating a DataOps culture has become a critical discipline for any organization that wants to have a competitive edge and survive in this data-driven market.”

Amita Potnis, Research Director, Enterprise Infrastructure Practice, IDC
“A new class of intelligent data operations platforms is emerging that can reduce friction, improve efficiencies with automation, and provide flexibility and openness with policy and metadata-driven processes that can accommodate the diversity and distribution of data in modern environments.”

