
Portkey Brings AI Guardrails on Top of Its Open Source AI Gateway

Portkey is filling a critical gap in productionizing AI apps: by bringing Guardrails on top of the AI Gateway, AI teams can now ship to production confidently.

LLMs are brittle, not just in API uptime or their inexplicable 400/500 errors, but also in their core behavior. You can get a response with a 200 status code that still breaks your app’s pipeline because the output doesn’t match what the pipeline expects. We’ve long thought about the problem of fixing LLM outputs at Portkey, and wondered whether we could bring some of the Guardrail abstractions directly into the Gateway. With Portkey’s Guardrails, we now complete the loop on building robust and reliable AI apps that behave EXACTLY as you want, every time.
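A 200-status response that still breaks the pipeline is easy to picture: the model answers in prose when the app expects structured JSON. The sketch below is a minimal, purely illustrative output guardrail in Python, not Portkey’s API; the expected keys, the check_output helper, and the verdict shape are all assumptions made for the example.

```python
import json

# Purely illustrative output guardrail: EXPECTED_KEYS, check_output, and the
# verdict dict are assumptions for this sketch, not Portkey's API.
EXPECTED_KEYS = {"sentiment", "confidence"}

def check_output(completion_text: str) -> dict:
    """Return a guardrail verdict for an LLM completion that arrived with HTTP 200."""
    try:
        payload = json.loads(completion_text)
    except json.JSONDecodeError:
        return {"verdict": "fail", "reason": "response is not valid JSON"}
    if not isinstance(payload, dict):
        return {"verdict": "fail", "reason": "response JSON is not an object"}
    missing = EXPECTED_KEYS - payload.keys()
    if missing:
        return {"verdict": "fail", "reason": f"missing keys: {sorted(missing)}"}
    return {"verdict": "pass", "reason": None}

# A completion that arrives with a 200 status can still fail the app:
print(check_output("Sure! The sentiment is positive."))               # fail: not JSON
print(check_output('{"sentiment": "positive"}'))                      # fail: missing 'confidence'
print(check_output('{"sentiment": "positive", "confidence": 0.93}'))  # pass
```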


“There was a crucial piece missing when it came to productionizing AI systems – orchestrating the LLM’s behavior based on guardrail verdicts on the inputs & outputs. We now solve it with this update!” — Rohit Agarwal


As Chip Huyen (VP of AI & OSS at Voltron Data) shared recently, Guardrails around a model Gateway solve two problems at once:
1. Identify and fix faulty LLM outputs, and
2. Orchestrate the request based on Guardrail results to actually make the app work (see the sketch after this list).
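As a rough sketch of that second point, hand-rolled here for illustration only (call_llm, the model names, and the retry-then-fallback policy are placeholders, not Portkey’s SDK), a failing guardrail verdict can trigger a retry on the primary model and then a switch to a fallback model before anything reaches the rest of the app:

```python
import json

# Illustrative orchestration driven by guardrail verdicts. call_llm, the model
# names, and the retry/fallback policy are stand-ins, not Portkey's SDK.
def call_llm(model: str, prompt: str) -> str:
    raise NotImplementedError("replace with your actual LLM client call")

def output_passes_guardrail(text: str) -> bool:
    """Minimal output guardrail for the sketch: the completion must be valid JSON."""
    try:
        json.loads(text)
        return True
    except json.JSONDecodeError:
        return False

def guarded_completion(prompt: str,
                       primary: str = "gpt-4o",
                       fallback: str = "claude-3-5-sonnet",
                       retries: int = 2) -> str:
    """Retry the primary model on failing verdicts, then try the fallback, else raise."""
    for model in [primary] * retries + [fallback]:
        text = call_llm(model, prompt)
        if output_passes_guardrail(text):
            return text  # only verified output reaches the rest of the app
    raise RuntimeError("all attempts failed the output guardrail")
```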


We are teaming up with multiple AI Guardrail leaders in the industry to do exactly this: bring their SOTA AI Guardrails on top of Portkey’s open source AI Gateway and make it incredibly easy for developers to use them in production. The Gateway orchestrates your LLM requests based on Guardrail results and makes your app behave exactly as it should.
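With the Gateway in the loop, that orchestration can move out of application code and into configuration. The dict below is a hypothetical sketch of what such a declarative setup might look like; the key names are illustrative assumptions, not Portkey’s documented config schema.

```python
# Hypothetical, purely illustrative gateway configuration; the key names below
# are assumptions for this sketch, not Portkey's documented config schema.
gateway_config = {
    "strategy": {"mode": "fallback"},              # try the next target when one fails
    "targets": [
        {"provider": "openai", "model": "gpt-4o"},
        {"provider": "anthropic", "model": "claude-3-5-sonnet"},
    ],
    "output_guardrails": [
        {"check": "json_schema", "on_fail": "retry_then_fallback"},  # reject malformed output
        {"check": "no_pii", "on_fail": "deny"},                      # example partner-provided check
    ],
}
```

The point is the division of labor: the app sends a request, and the Gateway applies the guardrail checks and routing policy before a response ever comes back.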

