Artificial Intelligence | News | Insights | AiThority

San Diego Supercomputer Center Supersizes Advanced Storage with Qumulo for its Data-Intensive Infrastructure

Qumulo, the leading provider of enterprise-proven hybrid cloud file storage, announced that the San Diego Supercomputer Center (SDSC), a forerunner in providing both data-intensive computing and cyberinfrastructure offerings to the national research community, has selected Qumulo's distributed file system to offer to its scientific research customers.

SDSC is a leading managed services provider (MSP) for the scientific community in government, academia, and business. As a research unit at the University of California, San Diego, SDSC uses its on-prem supercomputers to run advanced computation and all aspects of big data storage and analysis, including data integration, performance modeling, data mining, and predictive analytics.


The Center is a member of XSEDE (eXtreme Science and Engineering Discovery Environment), a single virtual system that enables researchers to interactively share computing resources, data collections, and advanced research tools.

The Center’s data-intensive gateways and client technology stacks must support high-performance, high-capacity storage for massive amounts of big data – much of it unstructured. Although the Center’s supercomputers easily handle computing tasks, the storage systems supporting its neuroscience projects lacked the massive scale-out capacity and the storage features necessary to support big data, fast access, and advanced analytics.


“Our storage requirements for data projects are growing from tens of terabytes to hundreds of terabytes,” said Amit Majumdar, Ph.D., director of data-enabled scientific computing at SDSC. “Large data transfer and storage, high-speed access, sharing, search functionalities — all of these are becoming more and more important for our projects.”

To successfully meet its MSP client requirements, SDSC selected Qumulo’s file storage solution because of its ability to provide an optimal balance of performance, capacity, scalability, durability, and advanced functionality.


“We’d been searching for a reliable, resilient, cost-effective solution for our customers with large-scale data needs,” said Michael Norman, director, SDSC. “In times past, we traded staff time for purchased solutions to meet tolerable price points. We were elated to partner with Qumulo and find we didn’t have to spend years building a DIY solution.”

“Qumulo has been incredibly easy for SDSC to manage,” said Brian Balderston, director of infrastructure, SDSC. “Instead of having to focus our manpower and resources on managing a number of inefficient storage systems, we are able to allocate and focus our engineering time to work on securing highly impactful and well-funded grants from the National Science Foundation, National Institutes of Health, and other funding agencies. That is a big win for the Center.”

