DataStax Adds Vector Search to Astra DB on Google Cloud for Building Real-time Generative AI Applications

DataStax, the real-time AI company, announced that Astra DB – its popular database-as-a-service (DBaaS) built on the open source Apache Cassandra database – now supports vector search, a key capability that lets the database serve as long-term memory for AI applications built on large language models (LLMs) and for other AI use cases.

Coming on the heels of the introduction of vector search into Cassandra, the availability of this new capability in the pay-as-you-go Astra DB service will enable developers to easily leverage the massively scalable Cassandra database for their LLM, AI assistant, and real-time generative AI projects. Goldman Sachs Research estimates that the generative AI software market could grow to $150 billion, compared with $685 billion for the global software industry as a whole.

DataStax is working closely with the Google Cloud AI/ML Center of Excellence, as part of the Built with Google AI program, to use the best of Google Cloud’s generative AI offerings to enhance the capabilities and experience of customers using DataStax.

Vector search enables developers to search a database by context or meaning rather than by keywords or literal values. It does this using “embeddings” – vector representations of semantic concepts, generated for example by Google Cloud’s text embedding API – which make it possible to search unstructured data such as text and images.
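
As a rough illustration of that idea, the sketch below asks a Google Cloud text embedding model to turn a couple of short descriptions into vectors. The vertexai SDK calls, the model name, and the project details are assumptions based on the publicly documented Python client, not details taken from the announcement.

```python
# Sketch: turning text into embedding vectors with Google Cloud's Vertex AI.
# The vertexai SDK calls, the model name "textembedding-gecko@001", and the
# project/location values are illustrative assumptions.
import vertexai
from vertexai.language_models import TextEmbeddingModel

vertexai.init(project="my-gcp-project", location="us-central1")  # hypothetical project

model = TextEmbeddingModel.from_pretrained("textembedding-gecko@001")
texts = [
    "A waterproof hiking boot for winter trails",
    "Lightweight running shoes for summer marathons",
]

for text, embedding in zip(texts, model.get_embeddings(texts)):
    # embedding.values is a list of floats: the vector representation of the text.
    print(f"{text!r} -> {len(embedding.values)}-dimensional vector")
```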

Embeddings are a powerful tool: they enable natural-language search across a large corpus of data in different formats and extract the most relevant pieces of it.
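
To make the “search by meaning” step concrete, here is a minimal, self-contained sketch in which documents and a query are mapped to vectors and the closest document is found by cosine similarity. The embed() function is a hypothetical stub standing in for a real embedding model such as the one sketched above.

```python
# Sketch: "search by meaning" as a nearest-neighbour lookup over embedding vectors.
# embed() is a hypothetical stub; with a real embedding model, semantically
# similar texts land near each other in vector space.
import numpy as np

def embed(text: str) -> np.ndarray:
    # Placeholder only: deterministic pseudo-random unit vectors so the example runs.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.normal(size=768)
    return v / np.linalg.norm(v)

documents = [
    "Refund policy for cancelled flights",
    "How to change the delivery address on an order",
    "Troubleshooting a router that keeps dropping Wi-Fi",
]
doc_vectors = np.stack([embed(d) for d in documents])

query_vector = embed("my internet connection keeps cutting out")

# Cosine similarity: with unit-length vectors this reduces to a dot product.
scores = doc_vectors @ query_vector
print("Most relevant document:", documents[int(np.argmax(scores))])
```

A production system would replace the stub with real embeddings and push the nearest-neighbour search down into the database, which is the role vector search in Astra DB is meant to play.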

Vector stores are required to enable extremely low-latency search across databases. Together, embeddings, vector stores, and generative AI models like Google PaLM 2 can create powerful capabilities that dynamically combine the right information for the right customer in their expected language. And now, with the ability to search by meaning, Cassandra will play a key role in building generative AI applications.
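
Concretely, a vector store boils down to queries like the one sketched below, where the DataStax Python driver runs an approximate-nearest-neighbour search over a table that keeps embeddings in a vector column. The keyspace, table, and column names, the bundle path, and the token are hypothetical placeholders, and the CQL shown should be checked against the current Astra DB documentation.

```python
# Sketch: an approximate-nearest-neighbour (ANN) query against an Astra DB table
# with a vector column, via the DataStax Python driver. Names and credentials are
# placeholders; the "ORDER BY ... ANN OF" clause follows the CQL vector-search
# syntax but may need a recent driver version for vector parameter binding.
from cassandra.cluster import Cluster
from cassandra.auth import PlainTextAuthProvider

cluster = Cluster(
    cloud={"secure_connect_bundle": "/path/to/secure-connect-bundle.zip"},
    auth_provider=PlainTextAuthProvider("token", "AstraCS:..."),  # Astra DB application token
)
session = cluster.connect("demo_keyspace")

# In practice this would be the embedding of the user's question; real embeddings
# have hundreds of dimensions, matching the table's vector<float, N> column.
query_vector = [0.12, -0.03, 0.87]

rows = session.execute(
    """
    SELECT doc_id, body
    FROM documents
    ORDER BY embedding ANN OF %s
    LIMIT 5
    """,
    (query_vector,),
)
for row in rows:
    print(row.doc_id, row.body[:80])
```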

“Vector search is a key part of the new AI stack; every developer building for AI needs to make their data easily queryable by AI agents,” said Ed Anuff, CPO, DataStax. “Unlike many other vector databases, Astra DB is not only built for global scale and availability, but supports the most stringent enterprise-level requirements for managing sensitive data including HIPAA, PCI, and PII regulations. It’s therefore an ideal option for both startups and enterprises that manage sensitive user information and want to build impactful generative AI applications.”

DataStax is launching the new vector search tool and other new features via a NoSQL copilot – a Google Cloud Gen AI-powered chatbot that helps DataStax customers develop AI applications on Astra DB. DataStax and Google Cloud are releasing CassIO – an open source plugin to LangChain that makes it easy to combine Google Cloud’s Vertex AI service with Cassandra for caching, vector search, and chat history retrieval.
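
The combination looks roughly like the following sketch, which exposes a Cassandra table to LangChain as a vector store backed by Vertex AI embeddings, the kind of wiring CassIO is meant to simplify. The class names, parameters, and connection details are assumptions about the LangChain integrations of the time, not CassIO’s exact API.

```python
# Sketch: using Cassandra/Astra DB as a LangChain vector store with Vertex AI
# embeddings, the pattern CassIO targets. Class names, parameters, and
# connection details are assumptions / hypothetical placeholders.
from cassandra.cluster import Cluster
from cassandra.auth import PlainTextAuthProvider
from langchain.embeddings import VertexAIEmbeddings
from langchain.vectorstores import Cassandra

cluster = Cluster(
    cloud={"secure_connect_bundle": "/path/to/secure-connect-bundle.zip"},
    auth_provider=PlainTextAuthProvider("token", "AstraCS:..."),  # placeholder token
)
session = cluster.connect()

vector_store = Cassandra(
    embedding=VertexAIEmbeddings(),  # uses Google Cloud credentials from the environment
    session=session,
    keyspace="demo_keyspace",
    table_name="ai_documents",
)

# Index a few snippets, then retrieve the ones closest in meaning to a question.
vector_store.add_texts([
    "Astra DB is a managed service built on Apache Cassandra.",
    "Vector search retrieves records by semantic similarity rather than keywords.",
])
for doc in vector_store.similarity_search("how does semantic retrieval work?", k=2):
    print(doc.page_content)
```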

“Our customers consistently ask us for ways to tightly integrate data and AI capabilities,” said Stephen Orban, VP of Migrations and GenAI Ecosystem at Google Cloud. “By integrating Google Cloud’s generative AI capabilities into Astra DB, DataStax is adding natural language capabilities into a suite of already powerful database capabilities, and giving customers a complete and unified data and AI solutions approach.”

“Priceline has been at the forefront of using machine learning for many years,” said Martin Brodbeck, CTO, Priceline. “Vector search gives us the ability to semantically query the billions of real-time signals we receive as part of our checkout experience that flow back to Astra DB. We plan to use Google Cloud’s generative AI capabilities alongside Astra DB’s vector search to power our real time data infrastructure and generative AI experiences.”
