NXP Announces Expansion of Its Scalable Machine Learning Portfolio
NXP makes a strategic investment in Au-Zone Technologies to expand its eIQ Machine Learning development environment
NXP Semiconductors N.V. announced that it is enhancing its machine learning development environment and product portfolio. Through an investment, NXP has established an exclusive, strategic partnership with Canada-based Au-Zone Technologies to expand NXP’s eIQ Machine Learning (ML) software development environment with easy-to-use ML tools and to broaden its offering of silicon-optimized inference engines for Edge ML.
Additionally, NXP announced that it has been working with Arm as the lead technology partner in evolving the Arm® Ethos-U™ microNPU (Neural Processing Unit) architecture to support applications processors. NXP will integrate the Ethos-U65 microNPU into its next generation of i.MX applications processors to deliver energy-efficient, cost-effective ML solutions for the fast-growing Industrial and IoT Edge.
“NXP’s scalable applications processors deliver an efficient product platform and a broad ecosystem for our customers to quickly deliver innovative systems,” said Ron Martino, Senior Vice President and General Manager of Edge Processing at NXP Semiconductors. “Through these partnerships with both Arm and Au-Zone, in addition to technology developments within NXP, our goal is to continuously increase the efficiency of our processors while simultaneously increasing our customers’ productivity and reducing their time to market. NXP’s vision is to help our customers achieve lower cost of ownership, maintain high levels of security with critical data, and stay safe with enhanced forms of human-machine interaction.”
Enabling Machine Learning for All
Au-Zone’s DeepView ML Tool Suite will augment eIQ with an intuitive, graphical user interface (GUI) and workflow, enabling developers of all experience levels to import datasets and models, then rapidly train and deploy NN models and ML workloads across the NXP Edge processing portfolio. To meet the demanding requirements of today’s industrial and IoT applications, NXP’s eIQ-DeepView ML Tool Suite will provide developers with advanced features to prune, quantize, validate, and deploy public or proprietary NN models on NXP devices. Its on-target, graph-level profiling capability will give developers unique, run-time insights to optimize NN model architectures, system parameters, and run-time performance. By adding Au-Zone’s DeepView run-time inference engine to complement the open-source inference technologies in NXP eIQ, users will be able to quickly deploy and evaluate ML workloads and performance across NXP devices with minimal effort. A key feature of this run-time inference engine is that it optimizes system memory usage and data movement uniquely for each SoC architecture.
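The announcement does not describe DeepView’s internal APIs, but the quantization step such tools automate can be illustrated generically. The sketch below (plain NumPy, with invented helper names; not the DeepView or eIQ API) shows the affine int8 quantization that is typically applied to float32 weights before deployment on an edge NPU:

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Affine-quantize a float32 tensor to int8; returns (q, scale, zero_point).

    Illustrative only -- tools like eIQ/DeepView automate this per-layer,
    along with pruning, validation, and deployment.
    """
    w_min, w_max = float(weights.min()), float(weights.max())
    scale = (w_max - w_min) / 255.0           # int8 spans 256 levels
    zero_point = int(round(-128 - w_min / scale))  # maps w_min near -128
    q = np.clip(np.round(weights / scale) + zero_point, -128, 127).astype(np.int8)
    return q, scale, zero_point

def dequantize(q: np.ndarray, scale: float, zero_point: int) -> np.ndarray:
    """Recover approximate float32 values from the int8 representation."""
    return (q.astype(np.float32) - zero_point) * scale

w = np.array([-1.0, -0.5, 0.0, 0.5, 1.0], dtype=np.float32)
q, scale, zp = quantize_int8(w)
err = np.abs(dequantize(q, scale, zp) - w).max()
# Maximum reconstruction error stays below one quantization step (scale).
```

Quantizing weights this way cuts model storage and memory bandwidth roughly 4x versus float32, which is why it is a standard step when targeting MCU-class and NPU-accelerated edge devices.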
“Au-Zone is incredibly excited to announce this investment and strategic partnership with NXP, especially with its exciting roadmap for additional ML accelerated devices,” said Brad Scott, CEO of Au-Zone. “We created DeepView to provide developers with intuitive tools and inferencing technology, so this partnership represents a great union of world class silicon, run-time inference engine technology, and a development environment that will further accelerate the deployment of embedded ML features. This partnership builds on a decade of engineering collaboration with NXP and will serve as a catalyst to deliver more advanced Machine Learning technologies and turnkey solutions as OEMs continue to transition inferencing to the Edge.”
Expanding Machine Learning Acceleration
To accelerate machine learning in a wider range of Edge applications, NXP will expand its popular i.MX applications processors for the Industrial and IoT Edge by integrating the Arm Ethos-U65 microNPU, complementing the previously announced i.MX 8M Plus applications processor with integrated NPU. The NXP and Arm technology partnership focused on defining the system-level aspects of this microNPU, which supports up to 1 TOPS (512 parallel multiply-accumulate operations at 1 GHz). The Ethos-U65 maintains the MCU-class power efficiency of the Ethos-U55 while extending its applicability to higher-performance Cortex-A-based systems-on-chip (SoCs). The Ethos-U65 microNPU works in concert with the Cortex-M core already present in NXP’s i.MX families of heterogeneous SoCs, resulting in improved efficiency.
“There has been a surge of AI and ML across industrial and IoT applications driving demand for more on-device ML capabilities,” said Dennis Laudick, Vice President of Marketing, Machine Learning Group, at Arm. “The Ethos-U65 will power a new wave of edge AI, providing NXP customers with secure, reliable, and smart on-device intelligence.”