DataON announces Intel Select Solutions for AI Inferencing
Accelerate artificial intelligence (AI) inferencing and deployment on an optimized, verified infrastructure based on Intel technology
DataON, the industry-leading provider of hybrid cloud solutions for Microsoft Azure Stack HCI with cloud-based Azure Services, has collaborated with Intel to offer verified Intel Select Solutions for AI Inferencing, powered by 2nd Gen Intel Xeon Scalable processors.
Businesses are increasingly looking to artificial intelligence (AI) to increase revenue, drive efficiencies, and innovate their offerings. AI use cases powered by deep learning (DL) generate some of the most powerful and useful insights. Use cases such as image classification, object detection, image segmentation, natural language processing, and recommendation systems enable advances across numerous industries.
Intel Select Solutions for AI Inferencing provide you with a jumpstart to deploying efficient AI inferencing algorithms on solutions built on validated Intel architecture so you can innovate and go to market faster. To speed AI inferencing and time to market for applications built on AI, Intel Select Solutions for AI Inferencing combine several Intel and third-party software and hardware technologies.
Intel Select Solutions for AI Inferencing combine 2nd Gen Intel Xeon Scalable processors, Intel Optane solid state drives (SSDs), and Intel 3D NAND SSDs, so your business can quickly deploy a production-grade AI infrastructure built on a performance-optimized platform that offers high-capacity memory for the most demanding applications and workloads.
For Base configurations, the Intel Xeon Gold 6248 processor provides an optimized balance of price, performance, and built-in technologies that enhance performance and efficiency for inferencing on AI models. 2nd Gen Intel Xeon Scalable processors include Intel Deep Learning Boost, a family of acceleration features that improves AI inference performance using the specialized Vector Neural Network Instructions (VNNI).
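Intel Deep Learning Boost (VNNI) accelerates INT8 matrix math, so models are typically quantized to INT8 before deployment to benefit from it. As a minimal illustration only (PyTorch and this specific workflow are assumptions, not part of the DataON/Intel announcement), the sketch below shows one common way to convert a model to INT8 with PyTorch's dynamic quantization API; on 2nd Gen Intel Xeon Scalable processors, PyTorch's x86 backend can dispatch the resulting INT8 operations to VNNI-capable kernels.

```python
# Hypothetical sketch: INT8 dynamic quantization with PyTorch.
# On VNNI-capable CPUs, the INT8 kernels can use Intel Deep Learning Boost.
import torch
import torch.nn as nn

# Small stand-in model; in practice this would be a trained network.
model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)
model.eval()

# Quantize Linear layer weights to INT8 ahead of time;
# activations are quantized dynamically at inference time.
quantized_model = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# Run CPU inference; INT8 kernels are used where available.
with torch.no_grad():
    example_input = torch.randn(1, 512)
    output = quantized_model(example_input)
    print(output.shape)  # torch.Size([1, 10])
```

In production deployments the same idea applies at larger scale: the model is quantized once, and the inference service runs the INT8 version on the CPU to gain throughput without changing the application logic.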
Intel Optane technology fills critical gaps in the storage and memory hierarchy, enabling data centers to accelerate their access to data. This technology also disrupts the memory and storage tier, delivering persistent memory, large memory pools, fast caching, and storage in a variety of products and solutions.
AI inferencing performs best when the cache tier is on fast SSDs with low latency and high endurance. Intel Optane SSDs power the cache tier in these Intel Select Solutions. They offer high input/output operations per second (IOPS) per dollar with low latency, coupled with an endurance of 30 drive writes per day, making them ideal for write-heavy cache functions. The capacity tier is served by Intel 3D NAND SSDs, which deliver optimized read performance with a combination of data integrity, performance consistency, and drive reliability.
INTEL SELECT SOLUTIONS
These solutions add to DataON’s first generation of Intel Select Solutions for Windows Server Software-Defined Storage, second generation Intel Select Solutions for Azure Stack HCI, and Intel Select Solutions for Microsoft SQL Server. These solutions are designed by DataON and verified by Intel to reduce the time required to evaluate, select, and purchase hardware for today’s workloads and applications. They are rigorously benchmark-tested with today’s high-priority workloads, helping businesses realize smooth deployments and optimal performance. By eliminating guesswork and ensuring predictability with pre-defined configurations, businesses can take advantage of new technologies faster.
“DataON was an inaugural partner for Intel Select Solutions for Windows Server Software-Defined Storage, and we are proud to continue to add to our family of Intel Select Solution offerings,” said Howard Lo, vice president of sales and marketing, DataON. “Having these validated solutions demonstrates our commitment to providing Intel-based solutions for popular workloads such as Azure Stack HCI, SQL Server, and AI Inferencing. Customers will be able to realize a faster time-to-value and reduced operational costs with these pre-validated solutions.”
“Inferencing on deep neural networks requires substantial resources, along with high performance and consistent quality,” said Jake Smith, Director, Data Center Technologies at Intel. “Intel Select Solutions for AI Inferencing, delivered through industry leaders like DataON, accelerate time-to-deployment, help protect your IT investments, and provide our customers with a turnkey platform solution composed of validated Intel architecture building blocks for low-latency, high-throughput inference performed on a CPU.”