Confluent Introduces New Capabilities to Enable Organizations to Adopt Event Streaming for All Mission-Critical Production Environments
Latest Confluent Platform Features Help Further Event Streaming Deployments With Security, Resilience and Compliance Needed for Critical Enterprise Environments
Confluent, Inc., the event streaming platform pioneer, announced the release of Confluent Platform 5.4, which is focused on helping organizations on their path to making event streaming the central nervous system of their business. With a highly secure and resilient event streaming platform, organizations can now confidently bring information from every part of their business to a central platform to become more agile and event-driven.
As a new category of data infrastructure is created around event streaming, more than half of the Fortune 100 and thousands of companies globally are realizing the benefits of harnessing the value of events as they happen. Many large organizations are looking to quickly extend the number of teams, use cases and environments that rely on the power of event streaming. In turn, the list of requirements needed to operate Apache Kafka at this new level has grown. Today’s enterprises are faced with stricter demands around security and compliance, and event streaming platforms must evolve to meet these expectations in order for enterprises to successfully and efficiently grow their deployments.
“Increasingly demanding consumers and intensifying digital competition are pushing analytics from transactional to continuous. To achieve the necessary continuous intelligence, data and analytics leaders must understand and master the event stream processing market,” according to Gartner, Inc.’s “Market Guide for Event Stream Processing,” Nick Heudecker, et al, 7 August 2019.
Confluent is dedicated to building an event streaming platform that helps organizations achieve their goals of leveraging real-time event data to deliver innovative customer experiences and unlock game-changing internal efficiencies, regardless of the makeup of their IT environments. With Confluent Platform 5.4, organizations now have the security, resilience and storage they need to extend what’s possible with their event streaming platform for their business and customers.
“Enterprises are expected to deliver cutting edge digital experiences; however, their growing list of production prerequisites often holds them back from fully leveraging the event streaming capabilities needed to power those experiences at scale,” said Ganesh Srinivasan, vice president of product and engineering, Confluent. “To advance our goal of enabling developers, operators and architects to reap the benefits of event streaming, we’ve built Confluent Platform 5.4 to be the quickest route for enterprises to utilize event streaming at massive scale with the ease and security they require.”
To reduce the complexity of securing and deploying event streaming for any company, Confluent is delivering the following enhancements in Confluent Platform 5.4:
Enterprise-Grade Security for Accessing Critical Resources Across Multiple Teams
Role-Based Access Control: Platform-Wide Security with Fine-Tuned Granularity
As enterprises continue to grow and expand event streaming use cases, the number of connected systems and users increases along with potential risks from internal and external forces. To enforce more rigorous security authorizations, Confluent Role-Based Access Control provides a centralized framework for granting permissions to users and groups, starting at the cluster level and moving all the way down to individual topics, consumer groups or even individual connectors. Now Kafka operators can delegate the responsibility of managing access permissions to true resource owners, ensuring security at scale across all platform components (Control Center, Kafka Connect, KSQL, Schema Registry, REST Proxy, and MQTT Proxy).
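The hierarchical model described above, where permissions bound at the cluster level cascade down to individual resources, can be sketched in a few lines of Python. This is a simplified illustration only: real Confluent RBAC uses role bindings managed by the platform's Metadata Service, and the role names, principals and permission sets below are hypothetical.

```python
# Illustrative sketch of hierarchical role-based access control.
# All bindings, roles and principals here are hypothetical examples,
# not Confluent's actual RBAC configuration format.

ROLE_BINDINGS = [
    # (principal, role, resource_type, resource_name); "*" = cluster-wide scope
    ("group:platform-ops", "ClusterAdmin", "cluster", "*"),
    ("user:alice", "ResourceOwner", "topic", "orders"),
    ("user:bob", "DeveloperRead", "topic", "orders"),
]

ROLE_PERMISSIONS = {
    "ClusterAdmin": {"read", "write", "alter", "delete", "grant"},
    "ResourceOwner": {"read", "write", "alter", "delete", "grant"},
    "DeveloperRead": {"read"},
}

def is_authorized(principal, operation, resource_type, resource_name):
    """Check bindings from cluster-wide scope down to the individual resource."""
    for bound_principal, role, r_type, r_name in ROLE_BINDINGS:
        if bound_principal != principal:
            continue
        in_scope = r_name == "*" or (r_type == resource_type and r_name == resource_name)
        if in_scope and operation in ROLE_PERMISSIONS[role]:
            return True
    return False
```

Note how the `ResourceOwner` role carries the `grant` permission: this is what lets a resource owner delegate access to their own topic without involving the cluster operators.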
Structured Audit Logs: Security Traceability and Regulatory Compliance
Until now, activity happening in an event streaming platform was a black box for Kafka operators, which made it difficult to trace actions taken by users, detect abnormal behavior and identify potential security threats. Confluent Structured Audit Logs solves this issue by enabling operators to capture authorization logs in a set of dedicated Kafka topics. They can conduct deeper analysis on these logs to improve their systems and processes using Kafka native tools, such as KStreams or KSQL, or by offloading them to external systems with sink connectors pre-built by Confluent. Confluent is the first vendor to support CloudEvents, ensuring that Structured Audit Logs meets the industry’s standard for describing event data in a common way. This makes it easier to address compliance requirements around information security, which can accelerate enterprises’ ability to extend the power of Kafka to a wider range of new use cases.
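To make the CloudEvents point concrete, here is a hedged sketch of shaping a Kafka authorization event as a CloudEvents v1.0 record before it is written to a dedicated audit-log topic. The required context attributes (`specversion`, `id`, `source`, `type`, `time`) follow the CloudEvents specification, but the `source` URI, event `type` string and payload fields are illustrative assumptions, not Confluent's exact audit-log schema.

```python
# Hedged sketch: an authorization decision serialized as a CloudEvents v1.0
# record, as might be produced to a dedicated audit-log topic.
import json
import uuid
from datetime import datetime, timezone

def make_audit_event(principal, operation, resource, granted):
    return {
        # Required CloudEvents context attributes
        "specversion": "1.0",
        "id": str(uuid.uuid4()),
        "source": "crn:///kafka=my-cluster",               # hypothetical source URI
        "type": "io.confluent.kafka.server/authorization",  # illustrative type name
        "time": datetime.now(timezone.utc).isoformat(),
        "datacontenttype": "application/json",
        # Event payload: who attempted what, on which resource, and the outcome
        "data": {
            "principal": principal,
            "operation": operation,
            "resource": resource,
            "granted": granted,
        },
    }

event = make_audit_event("user:bob", "Write", "topic=orders", False)
payload = json.dumps(event)  # the value that would be produced to the audit topic
```

Because the record is plain JSON on a Kafka topic, downstream analysis with KStreams, KSQL or a sink connector needs no special parsing beyond the standard CloudEvents envelope.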
Fully Automated Disaster Recovery to Improve Reliability
Multi-Region Clusters: Disaster Recovery and Multi-Site Deployments for Kafka
For many global organizations spanning a wide range of geographic regions, running a single stretch Kafka cluster across multiple data centers meant tradeoffs in durability, availability and latency. So teams often resorted to replication across multiple clusters, which introduced inefficiencies in the recovery process due to coordination amongst multiple client teams. Confluent Multi-Region Clusters overcomes those tradeoffs by stretching a single Kafka cluster across multiple regions using synchronous and asynchronous replication at the topic level. By leveraging Kafka’s internal replication, Multi-Region Clusters enables automated client failover, which fundamentally improves Recovery Time Objectives (RTOs) and significantly streamlines disaster recovery operations.
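The topic-level mix of synchronous and asynchronous replication described above is typically expressed as a replica placement policy. The sketch below mirrors the general shape of such a policy, with fully replicating in-sync replicas in one region and asynchronous observer replicas in another; the region names, counts and exact JSON shape are an assumption for illustration, so consult Confluent's Multi-Region Clusters documentation for the authoritative format.

```python
# Illustrative replica placement policy in the style of Multi-Region Clusters.
# Treat the field names, racks and counts as an example, not a verbatim config.
import json

placement = {
    "version": 1,
    # Synchronous replicas: part of the ISR, so acks=all waits for them
    "replicas": [
        {"count": 2, "constraints": {"rack": "us-east-1"}},
    ],
    # Observers: replicate asynchronously and can be promoted during failover
    "observers": [
        {"count": 2, "constraints": {"rack": "us-west-2"}},
    ],
}

# A policy like this would typically be supplied when creating the topic;
# serializing it to JSON is the usual interchange form.
placement_json = json.dumps(placement, indent=2)
```

The design point is that durability and latency are traded off per topic rather than per cluster: latency-sensitive topics can keep all synchronous replicas in one region, while the observers in the remote region provide the automated failover path that improves RTO.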
Greater Control Over Data Quality to Increase Reliability of Entire Kafka Ecosystem
Schema Validation: Centralized Way to Control Data Compatibility
To ensure the reliability of an organization’s Kafka ecosystem, control over the quality of data being written to the system is critical. Otherwise, incorrectly formatted data can be published to Kafka, corrupting downstream systems and exposing them to failure. Kafka operators now have the option to enable Confluent Schema Validation at the topic level to ensure data compatibility across the platform as defined in Confluent Schema Registry. This is especially helpful for large organizations with multiple users and groups using Kafka to build event streaming applications.
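The core idea, rejecting a write whose schema is not registered for the target topic, can be sketched as a simple check. This is a minimal simulation: the real Schema Registry stores versioned Avro, JSON Schema or Protobuf schemas, and the broker validates the schema ID embedded in each message; the registry contents and IDs below are hypothetical.

```python
# Minimal sketch of broker-side schema validation against a simplified,
# in-memory "registry". IDs and topic names are hypothetical examples.
SCHEMA_REGISTRY = {
    # topic -> set of schema IDs registered for that topic's value subject
    "orders": {101, 102},
}

def validate_on_produce(topic, schema_id):
    """Reject writes whose schema ID is not registered for the topic."""
    registered = SCHEMA_REGISTRY.get(topic, set())
    if schema_id not in registered:
        raise ValueError(f"schema id {schema_id} not valid for topic {topic}")
    return True
```

Enforcing this at the topic level means a misconfigured producer fails fast at produce time, instead of silently writing malformed records that every consumer must then defend against.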
Object Storage Offload to Unlock Limitless Data Retention
Tiered Storage: Infinite Data Retention and Increased Elasticity
With more systems connected to Kafka and new event streaming apps and microservices developed on it, enterprises are required to store larger amounts of data for longer periods of time. That introduces prohibitively high costs and makes scaling difficult, as compute and storage are tightly coupled. Confluent Tiered Storage allows brokers to recognize storage in two tiers: local disks and object stores. Offloading older data to object storage drastically improves elasticity and enables infinite retention so enterprises can more freely and cost-effectively expand the use of event streaming, perform year-over-year analytics and replay data to meet compliance requirements.
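The two-tier decision can be sketched as a simple age-based policy: recent log segments stay on local disk for low-latency reads, while older segments are offloaded to the object store. The threshold, segment fields and tier names below are illustrative assumptions, not Confluent's actual Tiered Storage internals.

```python
# Hedged sketch of an age-based tiering policy for Kafka log segments.
# The hotset window and segment structure are illustrative examples only.
from dataclasses import dataclass

HOTSET_SECONDS = 6 * 3600  # example: keep the last 6 hours on local disk

@dataclass
class Segment:
    base_offset: int
    age_seconds: int
    tier: str = "local"

def tier_segments(segments):
    """Offload segments older than the hotset window to the object-store tier."""
    for seg in segments:
        if seg.age_seconds > HOTSET_SECONDS:
            seg.tier = "object-store"  # e.g. S3; reads fetch the data back transparently
    return segments
```

Because object storage is effectively unbounded and priced per gigabyte, the retention limit shifts from local disk capacity to cost policy, which is what makes "infinite retention" and historical replay practical.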