
State of Data Report Emphasizes Emerging Shift to a Decentralized Model

A new study from Starburst and Red Hat finds that 55% of organizations say the pandemic has made data access more critical, a slight increase from the 2021 study. As a result, enterprises plan to prioritize multi-cloud flexibility and ease of use when selecting data infrastructure solutions. The second annual report, “The State of Data and What’s Next,” conducted by independent research firm Enterprise Management Associates (EMA), found that the shift to quick and flexible deployments is imperative for driving the business functions and insights required to deliver valuable customer experiences in today’s fast-paced, distributed environment.

The new research indicates that surveyed global organizations are increasingly relying on data, with pressures for data access growing to meet customer demands in an advancing digital, highly mobile landscape. It points to a major disconnect between the critical demand for faster data access and having a centralized data strategy, indicative of three main challenges facing data teams today:

  • Data Sprawl Has Grown in Complexity: Survey respondents most commonly reported that their organizations currently use 4-6 data platforms (43%), while 11% employ 10-12 platforms. Furthermore, organizations are adding new data types to their environments at an increasing rate, citing streaming data (65% of respondents) as the most popular data type they plan to collect in the next year, followed by video and event data (60% of respondents).
  • Pains in the Pipeline Process Persist: In the wake of the COVID-19 pandemic, organizations must now accelerate data-driven decision-making to keep pace with ever-shifting customer expectations. However, for over 48% of survey respondents, it takes more than 24 hours to create a data pipeline, then another 24 hours (for 51% of respondents) to move data pipelines into production, making real-time business operations a challenge. This, combined with the need for faster data access, is pushing the industry away from the painful pipeline process and into a more decentralized model, or Data Mesh.
  • AI/ML Are Increasing and Placing Greater Pressure on a Variety of Systems: EMA is seeing a shift wherein data science (ML and neural networks) is rated as the most important analytical workload, which is applying increased pressure on already complex data platforms. Organizations need to process vast amounts of AI/ML data to fuel these workloads, but 31% of respondents said that data constantly being moved and changed makes finding data difficult. Given the ongoing struggle to find data science resources, more automation of data science workloads and better data access are required to improve AI/ML models and reduce resource demands.


“Data is the lifeblood of every business in the digital economy. This year’s study highlights that organizations have clear demands for faster and more secure data access, but traditional approaches amidst increasing ecosystem complexity are holding teams back,” said Justin Borgman, CEO and co-founder of Starburst. “As data sprawl and distributed data challenges continue and pains in the pipeline process persist, enterprises are turning to new approaches, such as a Data Mesh architecture, that support a decentralized model. By empowering our customers to access all of their data, regardless of location, Starburst enables organizations to drive better and faster business decisions without the time-consuming and resource-intensive ETL process.”


The research findings also show that enterprises are turning to specific practices and product capabilities to meet these challenges:

  • Prioritize Faster Data Access: As a result of the challenges over the last two years, organizations must become nimble in providing fast, reliable access to data anytime, anywhere. The survey results show that demands on customer experience (33%), the ever-growing challenge of staying ahead of risk and market swings (29%), and employee engagement (29%) are the driving factors for these shifts.
  • Shift to a More Decentralized Model: Centralizing data certainly holds its benefits, including consolidated cost, high level of control, and ease of management. However, centralization also comes with increased risk, with a single point of failure and limited flexibility. This lack of flexibility can leave a business slow to adapt in a rapidly changing environment.
  • Automation of Critical Technology Systems: With data spread throughout organizations, the number one challenge enterprises face is the increased complexity of a hybrid, multi-cloud environment (40%). Organizations are looking to automation to maintain a competitive edge in the industry (38%), with a focus on the underlying data infrastructure with the automation of data pipelines, the adoption of intelligent search (33%), and the implementation of AI and ML in data processing to drive business decisions (32%).
  • Multi-Cloud Flexibility: Respondents cited that in 2021, 56% of their data was in the cloud compared to 44% on-premises. When asked the same question this year, respondents stated that 59% of their organization’s data resides in the cloud compared to 41% on-premises. The growing move to the cloud points to why multi-cloud flexibility (43%) remains the number-one impact on buying decisions with hybrid interoperability making a significant jump to 34%, up from 26% in 2021.

“Customers creating AI/ML enabled applications must rely on accessible data in order to accelerate model development and the deployment of intelligent applications across hybrid multicloud environments,” said Steven Huels, senior director, Software Engineering, Red Hat. “By creating a foundation for data and applications on cloud architecture, developers and data scientists can more quickly and repeatedly meet their business goals through the delivery of data-driven, intelligent applications.”


