
Organizations Rush to Use Generative AI Tools Despite Significant Security Concerns

89% consider GenAI tools a potential security risk

New research from Zscaler, Inc., the leader in cloud security, suggests that organizations are feeling pressure to rush into generative AI (GenAI) tool usage despite significant security concerns. According to its latest survey, “All eyes on securing GenAI,” of more than 900 global IT decision makers, although 89% of organizations consider GenAI tools like ChatGPT a potential security risk, 95% are already using them in some guise within their businesses.

Even more worryingly, 23% of this user group aren’t monitoring the usage at all, and 33% have yet to implement any additional GenAI-related security measures – though many have it on their roadmap. The situation appears particularly pronounced among smaller businesses (500-999 employees), where GenAI tool usage matches the overall figure (95%), yet as many as 94% recognize the risk of doing so.


“GenAI tools, including ChatGPT and others, hold immense promise for businesses in terms of speed, innovation, and efficiency,” emphasized Sanjay Kalra, VP Product Management at Zscaler. “However, with the current ambiguity surrounding their security measures, a mere 39% of organizations perceive their adoption as an opportunity rather than a threat. This not only jeopardizes their business and customer data integrity, but also squanders their tremendous potential.”

The rollout pressure isn’t coming from where people might think, however, with the results suggesting that IT has the ability to regain control of the situation. Despite mainstream awareness, it is not employees who appear to be the driving force behind current interest and usage – only 5% of respondents said it stemmed from employees. Instead, 59% said usage was being driven by the IT teams directly.


“The fact that IT teams are at the helm should offer a sense of reassurance to business leaders,” Kalra continued. “It signifies that the leadership team has the authority to strategically temper the pace of GenAI adoption and establish a firm hold on its security measures, before its prevalence within their organization advances any further. However, it’s essential to recognize that the window for achieving secure governance is rapidly diminishing.”


With 51% of respondents anticipating a significant increase in interest in GenAI tools before the end of the year, organizations need to act quickly to close the gap between use and security.

Here are a few steps business leaders can take to ensure GenAI use in their organization is properly secured:

  • Implement a holistic zero trust architecture to authorize only approved AI applications and users.
  • Conduct thorough security risk assessments for new AI applications to clearly understand and respond to vulnerabilities.
  • Establish a comprehensive logging system for tracking all AI prompts and responses.
  • Enable zero trust-powered Data Loss Prevention (DLP) measures for all AI activities to safeguard against data exfiltration.
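The logging and DLP steps above can be sketched in code. The following is a minimal, illustrative Python sketch of an audit wrapper that records every prompt/response pair and blocks prompts that match simple sensitive-data patterns; it is not Zscaler’s product, and the pattern names and `log_interaction` helper are hypothetical examples, not a real API.

```python
import json
import logging
import re
from datetime import datetime, timezone

# Illustrative patterns that suggest sensitive data in an outbound prompt.
# A production DLP engine would use far richer detection than these regexes.
SENSITIVE_PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "api_key": re.compile(r"\b(?:sk|key)-[A-Za-z0-9]{16,}\b"),
}

def dlp_scan(text: str) -> list:
    """Return the names of sensitive-data patterns found in the text."""
    return [name for name, pat in SENSITIVE_PATTERNS.items() if pat.search(text)]

def log_interaction(user: str, prompt: str, response: str) -> dict:
    """Build an audit record for one prompt/response pair, blocking the
    response if the DLP scan flags the prompt."""
    findings = dlp_scan(prompt)
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "prompt": prompt,
        "response": response if not findings else None,
        "dlp_findings": findings,
        "blocked": bool(findings),
    }
    # A comprehensive logging system would ship this to central storage.
    logging.getLogger("genai.audit").info(json.dumps(record))
    return record
```

A clean prompt such as `log_interaction("alice", "Summarize our Q3 roadmap", "...")` produces a record with `blocked` set to `False`, while a prompt containing, say, an email address is flagged and its response withheld from the log.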


[To share your insights with us, please write to sghosh@martechseries.com]
