Data Science Insights: 96% of Data Professionals Are at or Over Capacity

Organizations are investing heavily in data science technologies to augment their Big Data analytics and predictive intelligence capabilities. But the growing investments are putting immense pressure on existing data science teams and resources. According to a recent report, 96% of data teams say they are already at or over their work capacity.

Businesses to Buy New Data Science Products and Tools

Teams Plan to Implement Automation Technology and Increase Hiring Investments

Ascend.io, the data engineering company, announced results from its second annual research study, The 2021 DataAware Pulse Survey, on the work capacity and priorities of data teams, including data analysts, data scientists, data engineers, and enterprise architects. Conducted in Q2 2021, the findings from more than 400 U.S.-based data professionals reveal key insights on teams’ current workloads, productivity bottlenecks, and their organizations’ accelerated need for data products. Demonstrating the pressure many data teams face, the research found that 96% of teams are at or over capacity, only one percentage point lower than in the 2020 study.

Business Demands for Data Projects Accelerate Faster Than Team Capacity

The vast majority (93%) of respondents expect the number of data pipelines in their organization to increase between now and the end of the year, with 56% projecting an increase of more than 50%. This surge appears to be affecting some roles more than others: data scientists are 2.4x more likely than other roles to anticipate significant growth in pipeline demands.

Encouragingly, the majority (79%) of respondents indicated that their infrastructure and systems can scale to meet increased data-volume processing needs, suggesting that the new scaling problem lies in team capacity rather than technology capacity.

Amid the significant increase in the number of data pipelines across their organizations, 74% of data professionals indicated that their organization’s need for data products is growing faster than their team size. This was especially true for data engineer respondents, 81% of whom cited that the need for data products was increasing faster than their team size.

“Data pipelines are fueling nearly every data-driven initiative across the business,” said Sean Knapp, CEO and founder of Ascend.io. “However, as innovations at the infrastructure layer continue to enable processing of greater volumes and velocities of data, businesses face a new scaling challenge: how to enable their teams to achieve more, and faster. Our research shows that team sizes are not scaling at a fast enough rate to keep up with the needs of the business. Combined with our data that highlights almost every data professional today is already at capacity, this leaves little room for strategic work and innovation.”
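
For readers unfamiliar with the term, a data pipeline is simply code that moves data from a source, reshapes it, and lands it where downstream users can query it. The survey itself contains no code, so the extract-transform-load sketch below is purely illustrative; the file, table, and column names are hypothetical and not drawn from the research.

```python
# Minimal, illustrative extract-transform-load (ETL) pipeline.
# All file, table, and column names are hypothetical examples.
import sqlite3
import pandas as pd

def extract(path: str) -> pd.DataFrame:
    # Pull raw records from a source file.
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Clean and aggregate: drop incomplete rows, sum amounts per day.
    df = df.dropna(subset=["order_date", "amount"])
    df["order_date"] = pd.to_datetime(df["order_date"]).dt.date
    return df.groupby("order_date", as_index=False)["amount"].sum()

def load(df: pd.DataFrame, db_path: str) -> None:
    # Land the aggregated result in a warehouse table.
    with sqlite3.connect(db_path) as conn:
        df.to_sql("daily_revenue", conn, if_exists="replace", index=False)

if __name__ == "__main__":
    load(transform(extract("orders.csv")), "warehouse.db")
```

Pipelines like this multiply quickly as new sources and data products are added, which helps explain why respondents expect their pipeline counts to grow so sharply.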

Lack of Data Engineering Resources Leads to Downstream Delays

When asked which tasks in their data ecosystem are the most backlogged, respondents were significantly more likely than their peers to identify their own business function: enterprise architects were 2.7 times more likely to name data architecture as the most backlogged, data analysts 4.5 times more likely to name data analysis, and data scientists 5.1 times more likely to name data science. Data engineers were an astounding 7.0 times more likely to name data engineering as the most resource-demanding function of the business.

Further evidence points to a growing backlog of data engineering tasks across teams. The most common bottlenecks include maintaining existing and legacy data systems (39%), data system setup and prototyping (30%), and the need to request access to data or systems (26%). Among respondents, data scientists reported the greatest need to request access to data, at 39%.

Knapp continued, “Across the board, our research shows that data engineering has become a significant roadblock for many teams. Simply put, data engineers are overburdened with building and maintaining mission-critical yet fragile pipelines. This can lead to significant delays for downstream users, as well as drive many teams to address data engineering bottlenecks by attempting to self-serve the data they need access to, often with mixed results, depending on the maturity of their in-house platforms. As a result, each team becomes overwhelmed with their own workload – a dynamic that can quickly lead to organizational friction, if not properly addressed, as well as inhibit the ability to meet the data needs of the business.”

Teams Turn to Automation, Yet Are Wary of No-code Tools for Data Science

Data teams are seeking technology solutions to overcome their bandwidth limitations and do more, faster. When asked how they plan to increase bandwidth across their team, more than half of respondents plan to buy new products or tools (53%) or implement automation technology (53%). Additional means to increase bandwidth include hiring more staff (47%) and re-platforming and retiring legacy technologies (30%).

As data teams evaluate new solutions, concerns remain around the usability of low- and no-code technology. In the 2020 study, 80% of respondents reported they were already using or considering low- and no-code solutions to increase their team’s bandwidth. Interestingly, the 2021 research indicates that only 4% of data professionals prefer a no-code user interface. However, that number jumps to 73% if the solution offers the flexibility of using low- and no-code user interfaces in conjunction with higher-code options, signaling a tremendous surge in interest in flexible coding (flex-code) approaches.

“In recent years, there has been a marked increase in the number of low- and no-code technologies available in the market,” said Knapp. “Despite the many benefits of these tools, businesses today are quickly discovering they also carry serious limitations. Many low- and no-code solutions don’t allow teams to customize code when needed for more complex business logic. The end result is that while making 95% of the job easier, the last 5% becomes impossible. With this in mind, it’s no wonder that 73% of data teams indicated they would be more likely to use a low- or no-code tool if it offered the ability to use their preferred programming language.”
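
Ascend.io’s own product interfaces are not described in the survey, so the following is only a hypothetical sketch of the flex-code idea: most of a pipeline step is expressed as declarative configuration, with an escape hatch into user-written Python for the “last 5%” of business logic that configuration cannot express. The function names and configuration keys are invented for illustration.

```python
# Hypothetical "flex-code" pipeline step: common transforms are plain
# configuration, with an optional escape hatch into custom Python code.
from typing import Callable, Optional
import pandas as pd

def run_step(df: pd.DataFrame,
             config: dict,
             custom_fn: Optional[Callable[[pd.DataFrame], pd.DataFrame]] = None) -> pd.DataFrame:
    # "No-code" portion: declarative row filtering and column selection.
    if "filter" in config:
        df = df.query(config["filter"])
    if "columns" in config:
        df = df[config["columns"]]
    # "Higher-code" portion: hand off to user-supplied code when provided.
    if custom_fn is not None:
        df = custom_fn(df)
    return df

# The "last 5%": business logic the declarative config cannot express.
def keep_latest_per_customer(df: pd.DataFrame) -> pd.DataFrame:
    return df.sort_values("updated_at").drop_duplicates("customer_id", keep="last")

df = pd.DataFrame({
    "customer_id": [1, 1, 2],
    "region": ["EU", "EU", "US"],
    "updated_at": ["2021-01-01", "2021-02-01", "2021-01-15"],
})
print(run_step(df, {"filter": "region == 'EU'"}, custom_fn=keep_latest_per_customer))
```

In a design like this, the declarative portion keeps routine work fast, while the optional callback preserves a team’s ability to use its preferred programming language for the cases that matter most.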
