Inspur’s Liquid Cooling HPC Brings Fast Deployment and High Performance to Scientific Research
Recently, at ISC High Performance 2021 Digital, Inspur’s new liquid cooling HPC solution was demonstrated by Western Science and Technology Innovation Harbour (iHarbour) in Xi’an, an institution focused on promoting innovation in science and education. iHarbour uses Inspur’s flexible liquid cooling solution to power its scientific research platform.
The iHarbour HPC Platform has been designed to provide sufficient computing power for data-intensive scientific research. Built on Inspur systems with more than 10,000 CPU cores of 2.7GHz 2nd Gen Intel Xeon Scalable Processors and a 100Gbps InfiniBand high-speed network providing non-blocking interconnection among nodes, the platform achieves a computing capacity of over 1.1 petaflops and a parallel storage capacity of 3PB.
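For context (these are not figures from the announcement), the quoted peak of roughly 1.1 petaflops is consistent with a simple back-of-envelope estimate, assuming AVX-512 throughput of 32 double-precision FLOPs per cycle per core and a hypothetical count of about 12,800 cores:

```python
# Back-of-envelope peak FLOPS estimate for a cluster like iHarbour's.
# The core count and per-core throughput below are assumptions for
# illustration only; the article states "over 10,000 CPU cores".

CORES = 12_800                 # hypothetical core count
CLOCK_HZ = 2.7e9               # 2.7 GHz 2nd Gen Xeon Scalable
FLOPS_PER_CYCLE = 32           # AVX-512: 2 FMA units x 8 doubles x 2 ops

peak_flops = CORES * CLOCK_HZ * FLOPS_PER_CYCLE
print(f"Theoretical peak: {peak_flops / 1e15:.2f} PFLOPS")
# -> Theoretical peak: 1.11 PFLOPS
```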
To meet the computing demands of the institution’s researchers, iHarbour’s IT team needed to quickly deploy a large number of computing nodes while maintaining an ideal energy efficiency ratio in the server room. The emerging heat dissipation technology of liquid cooling was an obvious answer thanks to its low energy consumption, low noise, and high efficiency. However, turning a traditional data center into a liquid cooling HPC center normally requires tremendous resources to build cooling towers, water chilling units, and cooling pipelines, which can take months. On top of that, these cooling devices occupy an enormous amount of space and carry high maintenance costs (e.g., freeze-proofing in winter). iHarbour not only had to deploy the liquid cooling HPC platform quickly but had to do so without renovating the entire server room and without incurring high maintenance costs in the future.
To solve the problem, iHarbour adopted Inspur’s Mobile Liquid Cooling HPC Systems, equipped with Inspur’s i24 high-density liquid cooling servers and a mobile rack-mounted Coolant Distribution Unit (CDU). This marks the first deployment of this kind of liquid cooling in a large-scale HPC data center in China. The system uses industry-leading liquid cooling technology that employs warm water for heat dissipation. It can operate stably at an ambient temperature of 35°C, allowing the CPUs to run in high-performance mode for long periods of time. Tests showed that both High-Performance Linpack (HPL) benchmark performance and efficiency improved over the previous air cooling heat dissipation method. Every eight i24 computing nodes are attached to one rack-mounted CDU, which makes it possible to form liquid cooling clusters efficiently.

The new system does not require server room renovation and can be deployed in standard server racks, greatly shortening the deployment time for a new system. Liquid cooling servers can be scaled up by stacking units inside the rack, with no need to set up water pipelines inside the data center. This means there is little impact on business or research, scaling up is easy, and maintenance costs are minimal.
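As a minimal sketch of the topology described above, the 8-nodes-per-CDU ratio translates directly into the number of rack-mounted CDUs a cluster of a given size would need (the node count in the example is hypothetical):

```python
import math

# Minimal sketch of sizing rack-mounted CDUs for an i24-style cluster.
# The 8-nodes-per-CDU ratio comes from the article; the node count
# used below is a hypothetical input for illustration.

NODES_PER_CDU = 8

def cdus_required(node_count: int) -> int:
    """Number of rack-mounted CDUs needed for a given node count."""
    return math.ceil(node_count / NODES_PER_CDU)

print(cdus_required(200))  # e.g. 200 liquid-cooled nodes -> 25 CDUs
```

Because each CDU serves a fixed group of nodes inside the rack, adding capacity is a matter of stacking more node-plus-CDU groups rather than extending facility-level water pipelines.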
Since iHarbour’s HPC platform went into operation, it has run about 1 million computing tasks and supported application workloads across more than 20 disciplines, including materials, electricity, machinery, energy, and environmental science. To date, thanks to the new HPC platform, researchers from iHarbour have published one paper in Nature, three in Science, and over 20 in journals with an impact factor above 10, with many more selected as cover papers in academic journals.