Artificial Intelligence | News | Insights | AiThority

Decodable Launches Delta Lake Connector Into GA; New Classes of Databricks Use Cases Now Unlocked Via Simple, Affordable Access to AI-Enhanced Analytics

Decodable, the real-time data engineering company, unveiled a faster, simpler and less expensive way to ingest streaming data into Databricks. The capability, delivered via a new connector for users of the Decodable SaaS offering, was announced today at the Data + AI Summit in San Francisco.

“Ingesting streaming data into Databricks unlocks a host of powerful analytics capabilities,” said Eric Sammer, CEO and founder of Decodable. “Unfortunately, for many developers of real-time applications, the present process to make this happen is overly complicated and cost prohibitive. We built the Delta Lake connector to solve those problems and bring the power of Databricks to a whole new set of use cases that presently are blocked due to the cost and complexity of ingestion.”


Delta Lake is an open source storage framework that can be used to work with data from Amazon S3, Google Cloud, Azure Data Lake, Hadoop and numerous other data lakes.

The new Delta Lake connector is available to any Decodable user who wants to use Databricks with data from other systems. It enables ingestion of data into Databricks at the Bronze and Silver stages of the Databricks medallion data layer architecture. Using the Decodable service with the new connector gives application developers and data engineers a straightforward path to the AI-driven power of Databricks, through a data ingestion pipeline that is less expensive and less complex than existing batch ingestion.



Getting Started With Decodable and the Delta Lake Connector

Interested application developers can get started building their first Decodable pipeline with a free Decodable Developer account, which includes the Databricks Delta Lake connector.

Decodable is a real-time data engineering service for developers who need to move data between data stores, analytic databases or messaging systems. Decodable is a fully managed service with no infrastructure for users to manage and no batching that must be configured and monitored. This allows users to focus on the data and the application, not the infrastructure.

For Databricks users, Decodable provides simple, reliable data ingestion from any source on any cloud. Decodable’s on-the-wire processing uses SQL to transform and optimize data records before they arrive at Databricks, reducing storage and processing costs and improving data quality in the Bronze and Silver tiers of the Medallion architecture.

Using simple abstractions, Decodable users can rapidly construct SQL pipelines to perform simple or compound transformations including filtering, masking, routing, aggregation, triggering and more complex windowing functions. Using Decodable to prepare data for Databricks can significantly reduce data volumes, lowering compute and storage costs at the destination.
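As the article notes, Decodable pipelines are expressed in SQL. A minimal sketch of what such a pipeline might look like is below; the stream, table, and field names are hypothetical illustrations (not from the announcement), and the windowing syntax follows the Flink-style SQL commonly used for stream processing:

```sql
-- Hypothetical pipeline: filter, mask, and window-aggregate raw
-- clickstream events before they land in a Delta Lake Bronze table.
-- Names (bronze_page_views, raw_page_views, etc.) are illustrative.
INSERT INTO bronze_page_views
SELECT
  window_start,
  window_end,
  page_id,
  MD5(user_id) AS user_hash,   -- mask the raw identifier before it reaches the lake
  COUNT(*)     AS view_count   -- aggregation shrinks the data volume at the destination
FROM TABLE(
  TUMBLE(TABLE raw_page_views, DESCRIPTOR(event_time), INTERVAL '1' MINUTE))
WHERE page_id IS NOT NULL      -- filter out malformed events
GROUP BY window_start, window_end, page_id, MD5(user_id);
```

A pipeline like this sends one aggregated, masked row per page per minute to Databricks instead of every raw event, which is how on-the-wire transformation can reduce downstream storage and compute costs.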


