Sinequa Supercharges Intelligent Search Powered by NVIDIA
Sinequa adopts NVIDIA DGX systems to accelerate and scale the development and training of its Neural Search and Large Language Models
Leading AI-powered enterprise search provider Sinequa announced it will develop powerful, enterprise-grade AI capabilities using NVIDIA accelerated computing to transform the modern workplace. The collaboration enables organizations to make full use of generative AI within an enterprise setting by pairing it with Sinequa’s Neural Search, which will fundamentally transform the way work gets done.
Using NVIDIA DGX H100 systems, the world’s most advanced enterprise AI infrastructure, Sinequa will develop custom large language models (LLMs) trained to deliver unmatched AI-powered Neural Search. This will allow employees to securely find answers based on an organization’s own content, enabling faster processes and decision-making.
Sinequa will leverage the power of NVIDIA systems and software to develop new LLMs, improve the accuracy of existing models, and quickly deploy them into its platform, so customers continue to benefit from the most accurate and relevant search results possible. As more leaders look to capitalize on generative AI, access to relevant, accurate, reliable, and traceable information has never been more important. Combining comprehensive Neural Search with the best LLMs lets customers converse with their content, giving them the easiest, most powerful, and most complete way to capitalize on their organization's knowledge and drive the business forward.
Built for large-scale industrial data and content, Sinequa’s platform combines Neural Search and the Microsoft Azure OpenAI Service with any generative large language model. This expands Sinequa’s already extensive AI functionality with innovative use cases for generative AI across key enterprise verticals, including manufacturing and life sciences.
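The pattern described here, retrieving relevant enterprise passages with a search layer and handing them to a generative model to produce a grounded answer, is commonly known as retrieval-augmented generation. The minimal Python sketch below illustrates that general pattern against the Azure OpenAI Service; the helper function, deployment name, credentials, and prompt are illustrative assumptions, not Sinequa’s actual integration or API.

```python
# Hypothetical sketch of retrieval-augmented generation with Azure OpenAI.
# Passages returned by an enterprise search layer are passed to a chat model,
# which is instructed to answer only from that retrieved context.
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key="<AZURE_OPENAI_KEY>",                      # placeholder credential
    api_version="2024-02-01",
    azure_endpoint="https://<resource>.openai.azure.com",
)

def answer_from_search(question: str, passages: list[str]) -> str:
    """Compose a prompt from retrieved passages and ask the model for a grounded answer."""
    context = "\n\n".join(passages)
    response = client.chat.completions.create(
        model="<deployment-name>",                     # Azure OpenAI deployment to call
        messages=[
            {"role": "system",
             "content": "Answer using only the provided context. Say so if the answer is not in it."},
            {"role": "user",
             "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

# Example usage: in practice the passages would be the search engine's top results.
print(answer_from_search(
    "What torque specification applies to the main bearing assembly?",
    ["Passage 1 ...", "Passage 2 ..."],
))
```

The key design point is that the generative model never answers from its own parameters alone; the search layer supplies the organization’s own, access-controlled content, which keeps answers relevant and traceable.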
Manufacturers can now deploy powerful AI-based search capabilities combined with automated summaries to help engineering teams and service teams find answers. The solution connects teams with information across the product lifecycle and enhances their work with fast, accurate, and unified search and insight generation across projects. Employees are also granted easy visibility into all products and parts within an organization’s design, supply chain, manufacturing, and services processes.
For leading pharmaceutical organizations, Sinequa enables accurate, fast, traceable semantic search, insight generation, and summarization. Users can query and converse with a secure corpus of data, including proprietary life science systems, enterprise collaboration systems, and external data sources, to answer complex and nuanced questions. Comprehensive search results — with best-in-class relevance and the ability to generate concise summaries — enhance R&D intelligence, optimize clinical trials, and streamline regulatory workflows.
Alexandre Bilger, Co-CEO at Sinequa, said: “As generative AI continues to gain momentum, leading organizations realize that the key to applying this technology to their enterprise data is to combine it with intelligent search. Companies that embrace this transformation early will gain a competitive edge by accelerating the application of their corporate knowledge, easily leveraging the golden information hidden in their applications. Using NVIDIA DGX H100 systems, we are able to accelerate our cycle of development to bring cutting-edge LLMs to our customers so that they can converse with their content.”
“Enterprises looking to us to leverage transformational capabilities of generative AI need powerful infrastructure that can tackle the unique demands of today’s most complex AI models such as LLMs,” said Charlie Boyle, vice president of DGX systems at NVIDIA. “NVIDIA DGX H100 systems provide Sinequa the world’s most advanced AI platform to help them develop and deploy custom LLMs, paving the way for innovation across industries.”