
New Study Highlights Disconnect Between Enterprise Data Access Requirements and Current Capabilities to Make Data-Driven Decisions

Research sponsored by Starburst and Red Hat shows data access has become more critical for 53% of respondents since the start of the pandemic

New market research commissioned by Starburst and Red Hat shows that data access has become more critical for 53% of survey respondents throughout the pandemic, as analytics workloads and demands have increased significantly. The survey, conducted by independent research firm Enterprise Management Associates, found that the imperative for faster data access is about driving business outcomes, with 35% of survey respondents looking to analyze real-time changes to risk and 36% wanting to improve growth and revenue generation through more intelligent customer engagements.


More than one-third (37%) of survey respondents aren’t confident in their ability to access timely, relevant data for critical analytics and decision-making. “The State of Data and What’s Next” survey revealed that this lack of confidence stems from four main challenges currently facing data teams:

  1. Data is significantly distributed and this complexity isn’t going away. The survey found that half (52%) of respondents have data in five or more data storage platforms. And more enterprises are expected to follow this trend, with 56% of respondents claiming they will have data on more than five platforms in the next year. It appears distributed data is here to stay.
  2. Moving data across sources is riddled with challenges. Many enterprises find the task of building and deploying data pipelines difficult. Some of the biggest obstacles they face are combining data in motion with data at rest (32%), the excessive time it takes to address break/fix issues (30%), data pipeline complexity (26%) and the manual coding lift required to deploy error-free data pipelines (25%). (A minimal sketch of this pattern appears after this list.)
  3. Developing a data pipeline is time-intensive. Building the right infrastructure for accessing data takes significant time and manpower, and enterprises can’t afford to wait. For 45% of survey respondents, it currently takes more than a day to develop a data pipeline, with 27% saying it can take anywhere from three days to two months. Making that data pipeline operational takes even more time: 52% of respondents said it adds another day or more, and 24% said it adds another week to the process.
  4. Every second between querying data and gaining insight counts. With the landscape of almost every industry changing rapidly due to the pandemic, today’s enterprises have significantly less time to gain insight from their data before it is outdated. The viability of business decisions comes down to a matter of milliseconds for 17% of respondents, and 39% of business decisions require latency of one second or less.
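
The pipeline challenges in items 2 and 3 are easier to picture with a concrete, simplified example. The sketch below is not from the survey; it assumes a pandas/SQLAlchemy stack, and the connection strings, table names and column names are invented for illustration. It shows the “copy the data into a warehouse first” pattern whose build time and break/fix burden respondents describe.

```python
# Minimal, illustrative sketch (not from the survey) of a basic data pipeline:
# extract rows from an operational source, reshape them, and load them into a
# warehouse before analysts can query them. Connection strings, table names,
# and column names are assumptions.
import pandas as pd
import sqlalchemy as sa

source = sa.create_engine("postgresql://user:pass@crm-db:5432/crm")        # operational source (assumed)
warehouse = sa.create_engine("postgresql://user:pass@warehouse:5432/dwh")  # analytics target (assumed)

# Extract: pull the last day's orders from the operational database.
orders = pd.read_sql(
    "SELECT order_id, customer_id, quantity, unit_price, created_at "
    "FROM orders WHERE created_at >= current_date - 1",
    source,
)

# Transform: derive the field analysts actually query against.
orders["order_value_usd"] = orders["quantity"] * orders["unit_price"]

# Load: append into the warehouse fact table. Every extra hop like this is a
# place where the pipeline can break and where data freshness is lost.
orders.to_sql("fact_orders", warehouse, if_exists="append", index=False)
```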

“Data is the lifeblood of any business trying to navigate today’s digital economy. What we’re seeing in these survey results is that organizations have clear demands for faster and more comprehensive data access, but technical challenges still exist,” said Justin Borgman, CEO of Starburst. “It’s imperative that companies overcome these challenges because their customer experience, competitive advantage and growth depend on it. By empowering our customers to access all of their data, regardless of location, Starburst enables better and faster business decisions.”


The survey shows that enterprises are turning to specific practices and product capabilities to meet these challenges:

  1. Multi-cloud flexibility. Survey respondents report that 56% of their data is in the cloud and 44% is on-premises. The move to the cloud is progressing quickly, however, with respondents expecting to have 62% of their data in the cloud and 38% on-premises by the end of 2021. And this migration is not to one specific cloud provider: the number one criterion respondents looked for in cloud data storage was the flexibility to access data from multiple clouds (47%).
  2. Automation, search & cataloging. When asked, almost half (44%) of respondents identified automating IT and data operations as the most important practice to improve their organization’s data strategy. Survey respondents also identified implementing search (32%) and cataloguing data (30%) among the most important practices, pointing toward the necessity for organizations to make finding and using the right data a quick and easy process for users across the enterprise.
  3. Modern analytical capabilities. Legacy data solutions no longer meet the needs of the modern enterprise; data analysts need future-facing tools that support the processes they engage in most. For 42% of respondents, that is running a single query across relational databases, file systems and object storage (a sketch of such a federated query follows this list). Data analysts are also looking for the most support in analyzing streaming data or real-time events (38%) and running a single query across structured and semi-structured data (37%).
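
To make the “single query across sources” capability concrete, here is a minimal sketch using the open-source Trino Python client (Starburst’s platform builds on the Trino engine). The coordinator address, catalogs, schemas and table names are invented for illustration; the survey itself does not prescribe any particular tool or schema.

```python
# Minimal sketch of a single query spanning two systems through a Trino/Starburst
# coordinator, in the spirit of the capability respondents asked for. Host,
# catalog, schema, and table names are assumptions for illustration only.
import trino  # pip install trino

conn = trino.dbapi.connect(
    host="trino.example.com",  # assumed coordinator address
    port=8080,
    user="analyst",
    catalog="hive",
    schema="web",
)
cur = conn.cursor()

# One statement joins semi-structured click data in object storage (hive catalog)
# with structured customer records in a relational database (postgresql catalog),
# so nothing has to be copied into a single store before it can be analyzed.
cur.execute("""
    SELECT c.customer_id, c.segment, COUNT(*) AS engagement_events
    FROM hive.web.click_stream AS e
    JOIN postgresql.crm.customers AS c
      ON e.customer_id = c.customer_id
    GROUP BY c.customer_id, c.segment
    ORDER BY engagement_events DESC
    LIMIT 10
""")
for row in cur.fetchall():
    print(row)
```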


“The opportunity for every organization is to convert data into action,” said Mike Piech, Vice President and General Manager of Cloud Storage and Data Services at Red Hat. “But the challenge we see organizations facing is a veritable tsunami of diverse data in their hybrid cloud environments, with a range of users needing access to it, including application developer and DevOps teams, data science and data engineering teams, and cloud infrastructure teams. The imperative is to create a consistent foundation for storage that can handle a wider range of workload types, from structured databases and semi-structured data warehouses to unstructured data lakes. With this foundation, organizations can be better positioned to harness insights from their data, no matter where it comes from in their hybrid cloud environment.”

“It was clear even before this research that enterprise technology needs have drastically shifted over the past year,” said John Santaferraro, Research Director at Enterprise Management Associates. “This research identifies that there is a chasm between data requirements and current capabilities, pinpoints what is most needed to bridge that chasm, and provides the roadmap for how technology solutions will need to evolve to meet these needs.”

