Key Trends Driven by Data and Computing in 2023

From applied observability to hyper automation and the metaverse: the new data-driven technologies set to drive business change in 2023

In this article, I will explain the four data-driven technologies, from a macro to micro level, that will drive change over the next 12 months and beyond. They will enable every business to make greater use of their data, improve business efficiency and serve their end customers better and faster than ever before. 

Applied Observability – getting down to the detail of what’s happening across a business

Observability has grown up. It has broken out of its early life as a tech-focused term into something organizations recognize as the key to keeping track of critical data events in an increasingly de-coupled business world, spanning systems architecture through to the business operations it supports.

Moving forward, we now have the next level, “Applied Observability”, recognized by Gartner as a key 2023 strategic tech trend at its most recent Orlando Symposium.

Applied Observability enables organizations to exploit their data artifacts for competitive advantage. That means ensuring the right data is delivered at the right time for rapid action, based on confirmed stakeholder actions rather than intentions. “Observable” data includes the key digitized artifacts, such as logs, traces, API calls, dwell time, downloads and file transfers, that appear whenever a stakeholder takes any kind of action. Applied Observability feeds these observable artifacts back in a highly orchestrated and integrated way to accelerate organizational decision-making, allowing the business owner to track how long an action took to process at every point in the workflow. With that end-to-end visibility, businesses gain real-time insight into bottlenecks and areas for improvement, and can ensure their systems function optimally.
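
To make the idea concrete, here is a minimal sketch of the analysis side of Applied Observability, assuming a simplified stream of timestamped workflow artifacts (the workflow, hop names and timestamps below are invented for illustration): the observed events for one business transaction are folded into per-hop and end-to-end latencies, which is exactly the visibility described above.

```python
from datetime import datetime

# A minimal sketch, not a specific product's API: each "observable artifact" is a
# timestamped record of a workflow hop, e.g. a log line, trace span or API call record.
events = [
    {"workflow_id": "order-42", "hop": "api-gateway",      "ts": "2023-01-10T09:00:00.000+00:00"},
    {"workflow_id": "order-42", "hop": "payment-service",  "ts": "2023-01-10T09:00:00.180+00:00"},
    {"workflow_id": "order-42", "hop": "settlement-queue", "ts": "2023-01-10T09:00:01.450+00:00"},
    {"workflow_id": "order-42", "hop": "ledger-update",    "ts": "2023-01-10T09:00:01.520+00:00"},
]

def hop_latencies(events):
    """Compute per-hop and end-to-end latency for one workflow from its observed artifacts."""
    ordered = sorted(events, key=lambda e: e["ts"])
    stamps = [datetime.fromisoformat(e["ts"]) for e in ordered]
    per_hop = [
        (ordered[i]["hop"], ordered[i + 1]["hop"], (stamps[i + 1] - stamps[i]).total_seconds())
        for i in range(len(ordered) - 1)
    ]
    return per_hop, (stamps[-1] - stamps[0]).total_seconds()

per_hop, total = hop_latencies(events)
for src, dst, secs in per_hop:
    print(f"{src} -> {dst}: {secs:.3f}s")   # flags the slow hop (payment -> settlement here)
print(f"end-to-end: {total:.3f}s")
```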

Event-driven architecture and distributed tracing aid applied observability

But unearthing these insights requires an event-driven approach to software architecture. Distributed tracing, for example, is a method of tracking application requests as they flow from front-end devices to back-end services and databases. In contrast to traditional tracing, distributed tracing can be visualized as a searchable, graphical picture of when, where and how a single event flowed through an enterprise, regardless of the number of hops the workflow took to fully process.
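
As an illustration, the sketch below uses the OpenTelemetry Python SDK to produce a small tree of spans for a single request; the service and span names are invented, and the console exporter stands in for whatever tracing back end an enterprise actually uses.

```python
# A minimal distributed-tracing sketch using the OpenTelemetry Python SDK
# (pip install opentelemetry-sdk). Span and service names are illustrative only.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor, ConsoleSpanExporter

# Export spans to the console; in practice they would go to a collector or tracing back end.
provider = TracerProvider()
provider.add_span_processor(SimpleSpanProcessor(ConsoleSpanExporter()))
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("payments-demo")

# One request produces a tree of spans sharing a trace ID, which is what lets a
# tracing tool reassemble when, where and how the event flowed across its hops.
with tracer.start_as_current_span("checkout-request") as parent:
    parent.set_attribute("order.id", "order-42")
    with tracer.start_as_current_span("validate-card"):
        pass  # front-end / edge work
    with tracer.start_as_current_span("publish-payment-event"):
        pass  # hand-off to downstream services
```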

Consider the potential traceability and end-to-end observability benefits across a payment ecosystem underpinned by event-driven architecture. Embedding distributed tracing into an event mesh emits trace events in OpenTelemetry format, so banks can collect, visualize and analyze them in any compatible tool. This empowers them not only to confirm that a given message was published, but also to understand exactly when and by whom, where it went down to individual hops, and who received it and when – or why not.
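
A hedged sketch of the propagation mechanics: the standard OpenTelemetry propagation API injects a W3C traceparent header on publish and extracts it on consume, so every hop joins the same trace. A plain dictionary stands in for the event-mesh message headers here, since no specific broker API is assumed.

```python
# Sketch of carrying W3C trace context across an asynchronous hop. The "message"
# dict stands in for a real event-mesh message's user properties; the broker API
# itself is out of scope. Assumes the tracer set-up from the previous sketch.
from opentelemetry import trace
from opentelemetry.propagate import inject, extract

tracer = trace.get_tracer("payments-demo")

def publish(topic: str, payload: dict) -> dict:
    """Publisher side: inject the current trace context into the message headers."""
    headers: dict = {}
    with tracer.start_as_current_span(f"publish {topic}"):
        inject(headers)  # adds the 'traceparent' header for downstream hops
    return {"topic": topic, "headers": headers, "payload": payload}

def consume(message: dict) -> None:
    """Consumer side: continue the same trace, so every hop is linked end to end."""
    ctx = extract(message["headers"])
    with tracer.start_as_current_span(f"consume {message['topic']}", context=ctx) as span:
        span.set_attribute("payment.amount", message["payload"]["amount"])

consume(publish("payments/settled/EUR", {"amount": 250.0}))
```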

When planned strategically and executed successfully, Applied Observability is arguably the most powerful source of data-driven decision-making. And, since OpenTelemetry is a well-defined open standard, it can be implemented across both synchronous and asynchronous workflows.

Hyper Automation & Intelligent Automation: taking automation to the next level

Gartner and IDC both identify hyper and intelligent automation in their technology trends for 2023. Hyper automation moves automation up a level, adding more intelligence and a broader set of tools so that previously un-automatable tasks can be automated. Hyper automation initiatives come in many shapes and sizes and are being seen across a wide range of industries, from banking and insurance to manufacturing and healthcare.

Gartner points to U.S. healthcare company CVS Health as a prime example, having taken advantage of hyper automation to simplify its unwieldy benefits administration processes and improve efficiency, accuracy and customer service. A new system was developed to streamline tasks from application receipt and payments to issue resolution. These were previously cross-functional, largely manual and time-consuming tasks that involved analyzing data in a wide range of formats and aligning with complex coding rules. Using a combination of AI, RPA, machine learning, data analytics and natural language processing (NLP), the company was able to automate much of this work.

As consumers of goods and services continue to demand faster and better customer service, companies need to overlap development cycles and find ways to reduce the cost of portions of what they deliver in order to stay ahead of the curve. Removing the friction that slows down implementation teams is key to success with end customers.

Putting automation on the runway to success

But hyper automation requires an organization to be underpinned by an event-driven architecture (EDA) – and, more specifically, an event mesh – to maximize success. An event mesh is an interconnected network of event brokers that allows data to be pushed in real time to the parts of the organization where it is needed. It does this dynamically: new event types can be added at any time, and interest in events can be registered on the fly, allowing seamless interchange of data between the applications that want to use it.

Take the aviation industry as an example. An event mesh streams information such as flight routes, delays, cancellations and mileage accruals between applications, connected devices and people anywhere in the world, instantly. With an event mesh, information about events can be continuously streamed to multiple systems, filtered so that each system only receives the data it needs. This ultimately enhances the customer experience, as passengers, pilots and crews are notified in real time when something relevant occurs on each and every flight.
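
The following toy sketch illustrates that filtering behaviour with an in-memory stand-in for an event mesh: consumers register interest in hierarchical topics, and each one receives only the flight events it subscribed to. The topic names and wildcard syntax are illustrative, not any particular broker's API.

```python
from collections import defaultdict
from fnmatch import fnmatch

# Toy in-memory stand-in for an event mesh: hierarchical topics plus wildcard
# subscriptions mean each consumer only receives the events it registered for.
class ToyEventMesh:
    def __init__(self):
        self._subs = defaultdict(list)  # pattern -> callbacks

    def subscribe(self, pattern: str, callback) -> None:
        """Interest can be registered (and new event types added) at any time."""
        self._subs[pattern].append(callback)

    def publish(self, topic: str, event: dict) -> None:
        for pattern, callbacks in self._subs.items():
            if fnmatch(topic, pattern):
                for cb in callbacks:
                    cb(topic, event)

mesh = ToyEventMesh()
# The crew-notification app only cares about delays; the loyalty app only about mileage.
mesh.subscribe("flight/*/delay",   lambda t, e: print(f"[crew]    {t}: {e}"))
mesh.subscribe("flight/*/mileage", lambda t, e: print(f"[loyalty] {t}: {e}"))

mesh.publish("flight/BA117/delay",   {"minutes": 25})
mesh.publish("flight/BA117/mileage", {"passenger": "p-991", "miles": 3470})
```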

Or take the working example of Heathrow Airport. After downsizing its IT department during the pandemic, the airport needed to reduce its dependency on IT to deliver solutions. The answer was to pivot the department into the role of orchestrator, building a low-code/no-code community of practice that enabled employees across the wider business to build their own automations, from health and safety apps supporting a safe return to work to live audits.

As of last August, Heathrow’s hyper automation efforts had garnered huge savings in potential outsourcing costs, reduced paperwork by 120,000 pages and decreased manual data entry hours by more than 1,170.

The acceleration of the event-driven API Economy

The API economy has exploded with the proliferation of web applications and the rise of digital businesses that need to expose and connect applications and assets using familiar architectural patterns and protocols such as HTTP/Representational State Transfer (REST). Research from the MIT Initiative on the Digital Economy has shown that, over a four-year period, businesses using APIs saw 12.7% more growth in market capitalization than those that did not adopt them.

But the API economy is changing, as event-driven architecture and asynchronous event-driven APIs become increasingly important to companies striving to make their business processes, customer interactions and supply chains more real-time. Using both synchronous and asynchronous methods can result in an application environment in which system resources are used most effectively.
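
As a rough illustration of mixing the two styles, the sketch below pairs a blocking REST call with an asynchronous event handler; the endpoint URL and topic are hypothetical placeholders, not a real service.

```python
import requests  # pip install requests

# Sketch of mixing both styles in one workflow. The URL and topic are placeholders.
ORDER_API = "https://api.example.com/orders"     # synchronous, request/reply
ORDER_EVENTS_TOPIC = "orders/created/*"          # asynchronous, event-driven

def create_order_sync(payload: dict) -> dict:
    """Synchronous REST call: the caller blocks until it gets the answer it needs right now."""
    resp = requests.post(ORDER_API, json=payload, timeout=5)
    resp.raise_for_status()
    return resp.json()

def on_order_created(topic: str, event: dict) -> None:
    """Asynchronous handler: downstream systems react whenever the event arrives,
    without the publisher waiting on them."""
    print(f"fulfilment notified via {topic}: {event['order_id']}")

# In practice the handler would be registered with an event broker or mesh
# rather than called directly, e.g. mesh.subscribe(ORDER_EVENTS_TOPIC, on_order_created).
```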

Merging synchronous and asynchronous APIs – not a binary choice

AsyncAPI and RESTful APIs will be the next evolution of the API Economy. API Management Platforms will need to quickly adapt to this new reality. As a prime example, Forrester recently spotlighted a global biotech company that is blazing a trail in the API-driven area. As the case study demonstrates, the organization understood that EDA is a complement to RESTful APIs – another tool in their digital business toolbox, filling in the gaps of a REST-only approach.

As a result, they planned a unified event + REST approach from the outset. Both APIs and events are managed as enabling digital products, with the governance and lifecycle of both unified into one process. A platform team builds a unified API + event platform for app dev teams, rather than building each as a separate, siloed platform. The results include faster speed to market and new business opportunities. The company also embraces FAIR data principles: data that is Findable, Accessible, Interoperable and Reusable. A digital marketplace consisting of both events and APIs makes the data findable and machine-readable via OpenAPI Spec (OAS) and AsyncAPI metadata. Data can then be accessed on demand via APIs, while events push data in real time.
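
One way to picture such a unified catalogue is sketched below: each entry treats a REST interface (described by an OpenAPI spec) and its companion event stream (described by an AsyncAPI spec) as a single governed digital product. The field names and URLs are illustrative, not the schema of any specific platform.

```python
from dataclasses import dataclass, field, asdict
import json

# Minimal sketch of a "digital marketplace" catalogue entry that treats the REST API
# and its companion event stream as one governed product. Fields and URLs are placeholders.
@dataclass
class DigitalProduct:
    name: str
    owner: str
    openapi_spec_url: str        # synchronous interface, described with the OpenAPI Spec (OAS)
    asyncapi_spec_url: str       # asynchronous interface, described with AsyncAPI
    tags: list = field(default_factory=list)
    lifecycle: str = "released"  # one lifecycle/governance state covering both interfaces

catalog = [
    DigitalProduct(
        name="sample-orders",
        owner="platform-team",
        openapi_spec_url="https://example.com/specs/sample-orders/openapi.yaml",
        asyncapi_spec_url="https://example.com/specs/sample-orders/asyncapi.yaml",
        tags=["orders", "FAIR"],
    )
]

# Publishing the catalogue as JSON keeps the data findable and machine-readable.
print(json.dumps([asdict(p) for p in catalog], indent=2))
```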

From this example it is clear that modern enterprises realistically require a “one-stop shop” platform to best suit their unique application integration strategies – one that delivers unified management for accessing, reusing and exposing both asynchronous event-driven and synchronous RESTful APIs.

Blurring the physical and digital worlds: the Metaverse will continue at pace

Zooming out a bit, there is no getting away from the increasing prevalence of the Metaverse.

Its potential is clear: to reshape everything from consumer habits to public services, health, welfare and education through an emerging digital universe where providers, creators and consumers can experience a parallel life to their real-world existence. Driven by technologies including digital twins, augmented reality and virtual reality, the Metaverse will be worth $5 trillion by 2030, McKinsey estimates.

Real-time data underpins the Metaverse – it must go with the flow, just like the real world

Real-time data will be the vital common denominator that bridges the digital-physical divide and optimizes the Metaverse. McKinsey lists real-time data as a vital element in facilitating adoption of the Metaverse in several key industries. These include Retail – to enhance the shopping/in-store/product experience, capture efficiencies, and explore net-new revenue streams; Banking/Financial Services – to support Decentralized Finance structures; Transportation – to allow central coordination and project management (e.g., via IoT/digital twins), especially in logistics, with real-time data collection for optimization; and Healthcare – enabling fully personalized health consultations with access to real-time data.

The Metaverse by definition cannot be static. It needs to be in real time, in motion. It needs an event-driven architecture and a real-time event mesh to support it.

For example, if an avatar visits a retailer in the Metaverse, the retailer will need to keep track of countless user actions, such as how long someone spent looking at a particular product and which products attracted the most interest. The Metaverse will produce a significant amount of data that companies will want to analyze in real time. This data can be used to make decisions ranging from real-time inventory adjustments to incentivizing commerce through real-time offers, discounts and coupons, and even dynamic pricing.
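
A toy sketch of that kind of real-time analysis is shown below: avatar interaction events are folded into per-product dwell time, and an offer is triggered once interest crosses a threshold. The event fields and the 30-second threshold are invented for illustration.

```python
from collections import defaultdict

# Toy sketch: fold a stream of avatar interaction events into per-product dwell time
# and trigger a real-time offer once interest crosses a threshold.
OFFER_THRESHOLD_SECONDS = 30

dwell = defaultdict(float)  # (avatar_id, product_id) -> accumulated seconds

def on_event(event: dict) -> None:
    if event["type"] != "product_view":
        return
    key = (event["avatar_id"], event["product_id"])
    dwell[key] += event["seconds"]
    if dwell[key] >= OFFER_THRESHOLD_SECONDS:
        # In a real deployment this would publish an offer event back onto the mesh.
        print(f"offer 10% off {event['product_id']} to {event['avatar_id']}")
        dwell[key] = 0.0  # reset so the same offer is not repeated immediately

for e in [
    {"type": "product_view", "avatar_id": "a-7", "product_id": "sneaker-x", "seconds": 12},
    {"type": "product_view", "avatar_id": "a-7", "product_id": "sneaker-x", "seconds": 21},
]:
    on_event(e)
```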

Data-driven developments shape 2023

The four trends discussed above collectively offer the potential to reshape the way we do business, interact with customers and partners, and make data-driven decisions. The common thread across these trends is the need for better movement of data, in real time and in motion, all pointing to the need for an underlying event-driven architecture. To ensure success, organizations must adopt an event mesh underpinned by a proven strategy for designing, managing and governing events as they flow through the enterprise.
