Announcing AI Gateway #12569
Replies: 3 comments 9 replies
-
Hi Marco @subnetmarco, this is great. We'd be happy to contribute to this AI proxy to add support for the Amazon Bedrock InvokeModel API, so that Bedrock can be added to the list of providers you support.
-
A new release of AI Gateway has been announced with streaming support, new plugins, richer analytics, and more: https://konghq.com/blog/product-releases/ai-gateway-goes-ga We are going to release AWS Bedrock and Gemini support in just a few weeks. The new analytics let you extract all sorts of metrics from your GenAI usage.
-
@subnetmarco I have many LLM models behind the AI proxy. Can I load balance across multiple models and run health checks on them? I was unable to achieve load balancing for LLM models in my actual Kong configuration.
-
Hello everyone 👋, today we announced a set of new AI plugins in the broader Kong Gateway 3.6 release. You can read the full AI announcement here.
Essentially, we can now use Kong Gateway to proxy AI traffic to cloud and self-hosted LLMs, with a whole set of L7 AI capabilities covering prompts, observability, request/response transformation, and more. All other plugins can also now be used on top of upstream AI services to establish more advanced governance, security, and traffic control patterns.
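As a rough illustration of what this looks like in practice, here is a minimal sketch of attaching the ai-proxy plugin to a route in Kong's declarative configuration format. The service name, route path, model name, and credential are illustrative placeholders, and the exact plugin schema should be verified against the official ai-proxy documentation for your Kong version:

```yaml
# Sketch of a Kong declarative config (decK-style) with the ai-proxy plugin.
# All names and values below are placeholders, not a verified reference config.
_format_version: "3.0"
services:
  - name: ai-service
    url: http://localhost:32000   # placeholder upstream; ai-proxy routes to the LLM provider
    routes:
      - name: chat-route
        paths:
          - /chat
        plugins:
          - name: ai-proxy
            config:
              route_type: llm/v1/chat
              auth:
                header_name: Authorization
                header_value: Bearer <YOUR_API_KEY>   # placeholder credential
              model:
                provider: openai        # cloud or self-hosted provider
                name: gpt-4             # placeholder model name
```

With a config along these lines, clients call the Kong route (e.g. `/chat`) with an OpenAI-style chat payload, and the gateway handles provider authentication and request translation, so other Kong plugins (rate limiting, auth, logging) can be layered on the same route.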
You can learn more about the new AI capabilities and even see demonstration videos at the official product page.
I am opening this discussion thread to collect feedback, ideas, and suggestions from the community around this new AI use case.
Update: Kong AI Gateway 3.8 has been announced with semantic caching, semantic guardrails, semantic routing and more.