Confluent Launches New Fully Managed Confluent Cloud Connectors in Confluent Hub
With new connectors and ksqlDB pull queries, Confluent Cloud is the only complete solution for cloud-native event streaming
Confluent, Inc., the event streaming pioneer, announced new fully managed connectors for Confluent Cloud to seamlessly integrate events across cloud, on-premises, or hybrid environments, and the launch of ksqlDB pull queries in Confluent Cloud for streamlined processing of event data. This supports the latest theme for Project Metamorphosis – Complete. With connectors for a full ecosystem of on-premises systems, top cloud services, and popular enterprise tools, organizations can easily blend and analyze data from across their business, enabling more personalized customer experiences and improved operational efficiencies. Paired with ksqlDB, organizations can bring sophisticated, real-time applications to market faster than ever and save months of engineering time and resources.
“When an event streaming platform is an organization’s central nervous system for all real-time data, applications become more agile, intelligent, and responsive,” said Jay Kreps, co-founder and CEO, Confluent. “Our platform comes with over 120 production-ready Kafka connectors, so any organization can realize the power of a complete event-driven central nervous system without all the hassle.”
Expectations for highly responsive, real-time digital experiences are on the rise, especially as more people rely on applications for everyday tasks. To build these applications, organizations are adopting more modern, cloud-based infrastructure on top of legacy systems. According to IDC, “over 90% of enterprises worldwide will be relying on a mix of on-premises/dedicated private clouds, multiple public clouds, and legacy platforms to meet their infrastructure needs.”1 Data silos become a major problem with this combination of new and long-standing environments, leaving organizations unable to integrate critical information into new applications. Apache Kafka® is the backbone of many real-time applications; however, it takes 3-6 months on average for a full-time, dedicated engineer to build, test, and maintain a one-off connection to each environment.
“Event streaming played a critical role in modernizing the University of California San Diego’s digital experiences and processes for students and staff,” said Brian DeMeulle, executive director of enterprise architecture and infrastructure, University of California San Diego. “The ability to connect real-time data from multiple clouds, data warehouses, and virtually any source through Confluent’s pre-built Apache Kafka connectors empowered us to be more agile, mobile, and prepared for the digital-first era.”
Unify All Event Data with the Industry’s Largest Ecosystem of Pre-Built Kafka Connectors
To help organizations jumpstart the development of event streaming applications, Confluent built Confluent Hub as the app store for event streaming. With a clean layout and user-friendly UI, it’s easy to browse, search, and filter over 120 connectors, including 30 that are fully managed. This is the most extensive resource for seamlessly integrating events across the organization into Kafka and is built to help quickly identify the right connector for any project. Here are the newest additions coming to Confluent Hub:
New Fully Managed Kafka Connectors to Accelerate Development of Event-Driven Serverless Applications
Serverless technologies have become mainstream as developers aim to focus more on building applications rather than managing infrastructure. Confluent has made it easier to build event-driven serverless applications with new Confluent Cloud connectors to Azure Functions Sink, Google Cloud Functions Sink, and AWS Lambda Sink, all available in preview. These pre-built, fully managed Confluent Cloud connectors elastically scale, making moving data in and out of Kafka an effortless task. Now developing event-driven serverless applications is simplified and faster than ever.
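As a rough illustration, provisioning a fully managed sink connector in Confluent Cloud comes down to supplying a small configuration rather than operating connect infrastructure. The fragment below sketches what a configuration for the AWS Lambda Sink connector might look like; the field names, topic, and function name are illustrative placeholders (the exact keys are defined in Confluent's connector documentation), not taken from the announcement:

```json
{
  "name": "lambda-sink-example",
  "connector.class": "LambdaSink",
  "topics": "orders",
  "input.data.format": "JSON",
  "aws.lambda.function.name": "process-order-event",
  "kafka.api.key": "<api-key>",
  "kafka.api.secret": "<api-secret>",
  "tasks.max": "1"
}
```

Because the connector is fully managed, scaling, failure recovery, and upgrades are handled by Confluent Cloud rather than by the team operating the configuration.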
New Self-Managed Kafka Connector to Break Down Legacy System Data Silos
For many long-standing organizations, it’s difficult to get data out of on-premises legacy systems and into more modern, SaaS-based applications. Confluent has announced a new Oracle Change Data Capture (CDC) Source connector in early access to help solve this challenge. With this connector, organizations can make sure other applications are aware when data in an Oracle database has been altered or deleted. This enables more accurate real-time analytics and faster database migrations. To learn more about the Oracle CDC Source connector, contact our account team.
Get the Most Out of Data with ksqlDB’s Real-Time Stream Processing
After leveraging Confluent’s connectors to democratize data, the next challenge is knowing how to filter, transform, and aggregate that data so it can be used by applications across a business. This typically requires deep Kafka expertise for each system in the process, but ksqlDB, the event streaming database purpose-built for stream processing, cuts down the steps and tools needed. With ksqlDB pull queries now available in Confluent Cloud, it’s possible to perform point-in-time lookups on a stream of events, so applications can easily pull information like a customer’s current address or membership status and take action on that data. This, paired with ksqlDB’s push queries in Confluent Cloud, enables the quickest path to building event streaming applications end to end.
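As a sketch of the difference (the stream, table, and column names below are hypothetical, not from the announcement): a pull query returns the current value for a key and terminates, while a push query subscribes to ongoing changes.

```sql
-- Materialized table over a stream of customer updates
-- (names are illustrative)
CREATE TABLE customer_profiles AS
  SELECT customer_id,
         LATEST_BY_OFFSET(address) AS current_address,
         LATEST_BY_OFFSET(membership_status) AS membership_status
  FROM customer_updates
  GROUP BY customer_id;

-- Pull query: point-in-time lookup that returns immediately
SELECT current_address, membership_status
FROM customer_profiles
WHERE customer_id = 'C-1001';

-- Push query: streams result rows as the table changes
SELECT customer_id, membership_status
FROM customer_profiles
EMIT CHANGES;
```

The `EMIT CHANGES` clause is what distinguishes a push query; without it, ksqlDB serves the query as a one-shot lookup against the materialized state.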
Confluent’s Complete Event Streaming Platform Brings Together All Pieces Needed for Pervasive Event Streaming
Through Project Metamorphosis, Confluent has built a complete event streaming platform that delivers on the foundational characteristics of a cloud-native data system so that any organization can benefit from the full power of event streaming. Over this eight-month initiative, we’ve laid the foundation for next-generation event streaming that is elastic, cost-effective, infinite, everywhere, global, secure, reliable, and now complete. Our work spans Confluent Platform, Confluent Cloud, and ksqlDB to deliver a complete platform for any enterprise to use event streaming for mission-critical use cases from start to finish with ease.