
Arize AI, LlamaIndex Roll Out Joint Platform for Evaluating LLM Applications

A strategic alliance and joint product promise to broaden the adoption of generative AI across industries


Arize AI, a pioneer and leader in AI observability and LLM evaluation, and LlamaIndex, a leading data framework for LLM applications, today debuted a new joint offering called LlamaTrace, a hosted version of the open source Arize Phoenix.

According to a soon-to-be-released survey, 47.7% of AI engineers and developers building generative AI applications are leveraging retrieval in their LLM applications today. By connecting data to generative AI, orchestration frameworks like LlamaIndex can be game-changers in accelerating generative AI development. However, for many teams and enterprises, technical challenges remain in getting modern LLM systems, with their layers of abstraction, ready for the real world.


To help, Arize and LlamaIndex are debuting an LLM tracing and observability platform that works natively with the LlamaIndex and Arize ecosystems. Built on the open source Arize Phoenix, the hosted version of Phoenix persists application telemetry data generated during AI development, making it easier to experiment, iterate, and collaborate in both development and production.


The solution has a foundation in open source and offers a fully hosted, online, persistent deployment option for teams that do not want to self-host. AI engineers can instantly log traces, persist datasets, run experiments and evaluations, and share those insights with colleagues.


The new offering is available today and can be accessed through either a LlamaIndex or Arize account.

“We share a vision with LlamaIndex in enabling builders to reduce the time it takes to deploy generative AI into production but in a way that is super battle hardened for business-critical use cases,” said Jason Lopatecki, CEO and Co-Founder of Arize. “As leaders in our respective spaces with a common philosophy in empowering AI engineers and developers, we’re uniquely positioned here to do something that can move modern LLMOps forward and broaden adoption.”

“Prototyping a RAG pipeline or agent is easy, but every AI engineer needs the right data processing layer, orchestration framework, and experimentation/monitoring tool in order to take these applications to production. LlamaTrace by Arize offers the richest toolkit we’ve seen in enabling developers to observe, debug, and evaluate every granular step of a very complex LLM workflow, and it nicely complements the production-ready data platform and orchestration framework that LlamaCloud and LlamaIndex offer,” said Jerry Liu, CEO of LlamaIndex.


