Data Storage Trends: File and Data Services in the Cloud
With the explosive growth of AWS, Azure, and Google Cloud, it is clear that many organizations are finding success with File and Data Services in the Cloud, a natural step in their cloud journey. The public cloud offers dynamic, scalable services in which hyper-scale infrastructure meets customer needs, but it requires using cloud-native APIs and architecting for application-level resiliency. New-stack applications in the public cloud touch our everyday lives: services like Netflix and Uber are built to get to market quickly and enable rapid iteration by DevOps teams.
What is Impossible in File and Data Services?
The cloud offers tantalizing simplicity, scalability, and reliability. But if your workload is file-based, you face an impossible choice: either sacrifice enterprise features and performance with a cloud-only solution, or rewrite your workflow to run against object storage.
Traditionally, applications and workflows required costly redevelopment to use cloud-native APIs. Existing applications represent the next wave of cloud adoption, and today there is an easier path that avoids significant redevelopment: using cloud-native services such as AI, IoT, and blockchain to “wrap” migrated apps, or those stranded on-prem, with new functionality.
Virtualization of compute resources in the public cloud was solved in the last decade, but the problems with data management remain. Today, industries like media and entertainment, silicon chip design, and oil and gas exploration want to utilize the near-infinite scale of public cloud IaaS resources. However, the data they want to process sits in petabyte-scale systems outside the public cloud, accessed through an entirely different set of APIs, and their applications depend on access to it.
Relation Between Cloud Native Management and File and Data Services
Cloud-native data management APIs offer simple storage services that are easy to consume as objects and buckets at almost any scale, but they require data access models different from those of most existing applications, which can mean burdensome re-architecture.
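To make the access-model gap concrete, here is a minimal sketch contrasting the two. The file half uses the real POSIX-style file API; the object half uses a plain dict as an illustrative stand-in for a bucket (no real cloud SDK is involved), since object stores treat objects as immutable blobs that are read and replaced whole.

```python
import os
import tempfile

# File API: in-place partial update -- seek to an offset and overwrite bytes.
path = os.path.join(tempfile.mkdtemp(), "record.dat")
with open(path, "wb") as f:
    f.write(b"AAAAAAAAAA")
with open(path, "r+b") as f:   # open for in-place modification
    f.seek(4)
    f.write(b"BB")             # rewrites 2 bytes; the rest is untouched

with open(path, "rb") as f:
    print(f.read())            # b'AAAABBAAAA'

# Object model (a dict standing in for a bucket): objects are immutable
# blobs keyed by name, so a "partial update" means getting the whole object,
# modifying it in memory, and putting a full replacement back.
bucket = {}
bucket["record.dat"] = b"AAAAAAAAAA"                  # PUT
blob = bucket["record.dat"]                           # GET the entire object
bucket["record.dat"] = blob[:4] + b"BB" + blob[6:]    # PUT a full replacement
print(bucket["record.dat"])    # b'AAAABBAAAA'
```

The end state is identical, but an application written around seek-and-write semantics has to be restructured around whole-object GET/PUT cycles, which is exactly the re-architecture burden described above.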
File has been the de facto data management API for applications for more than thirty years, with NFS on Unix/Linux and CIFS on Windows/macOS providing access to remote file resources. Workflows and services use file at their core for data access and movement. The file API offers rich metadata, security, and compatibility but, most importantly, transparent access from almost any platform. We know there is a better way: one in which customers can use the power of cloud-provider object and block services while leveraging file at scale for all their data needs across hybrid environments.
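A short illustration of the metadata richness the file API carries "for free": every file comes with ownership, POSIX permission bits, size, and timestamps, enforced by the operating system rather than the application. (The file name here is illustrative.)

```python
import os
import stat
import tempfile
import time

path = os.path.join(tempfile.mkdtemp(), "report.txt")
with open(path, "w") as f:
    f.write("quarterly numbers\n")

# POSIX security model: owner read/write, group read, no access for others.
os.chmod(path, 0o640)

st = os.stat(path)
print(stat.filemode(st.st_mode))   # '-rw-r-----'
print(st.st_size)                  # 18 (bytes)
print(time.ctime(st.st_mtime))     # last-modified timestamp

# An object store exposes none of this natively; equivalent metadata must be
# attached as key/value pairs on each object and enforced by the application.
```

This is the compatibility surface that lets almost any platform consume the same data transparently, and it is what a lift-and-shift workload silently depends on.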
Petabyte-scale distributed systems manage the data lifecycle with reliability and protection, but public cloud providers offer only basic support for file API-based services and lack the features many customers require of their existing systems.
File and Data Services in the Cloud were initially designed so applications could burst into the cloud: a working set of data is copied into the cloud file service, the workload executes in the public cloud using the existing file API, and the results are then pulled back out. The level of sophistication customers need to orchestrate data into the public cloud remains a barrier.
In media and entertainment, where compute resources are critical for rendering special effects, GCP and AWS have built data centers near studios in Hollywood to eliminate data orchestration: low-latency private links connect public cloud utility compute directly to the studios’ existing data management systems. There is a need for persistent, petabyte-scale, file-based solutions in the public cloud that bridge the needs of existing customer applications and workflows while offering the agility and scalability of public cloud services.
Organizations want the ability to harness all of the public cloud’s IaaS and PaaS power and to leverage their data no matter where it resides. Businesses no longer have to re-architect their apps when moving them to the public cloud. If a user chooses to leave an app on-prem but move its data to the public cloud, they want to connect the data seamlessly, along with any cloud-provider services the data might use, such as AI or IoT platforms.
File data of all types and sizes can now be moved easily to the public cloud or to a mixed private and public cloud environment. With this approach, businesses are no longer tethered to hardware limitations: they can scale up or down instantly and build, deliver, and innovate on the apps and services that matter most to their customers.