I'm looking to set up a proxy between the agent server and the OpenAI server, so I need to pass some metadata along with each request in order to run pre-checks before the request reaches the OpenAI server. As of now, the LLM interface doesn't support that. I think it would be a small change, and useful for anyone looking to make use of the `metadata` param from the OpenAI client.
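As a sketch of the kind of change being requested, the wrapper below shows one way metadata could be threaded through an LLM interface to the underlying client. The class and method names here are hypothetical illustrations, not the project's actual interface; it only builds the request payload, with no network call.

```python
from typing import Any, Dict, Optional


class LLM:
    """Hypothetical LLM wrapper; names are illustrative, not the real interface."""

    def __init__(self, metadata: Optional[Dict[str, str]] = None):
        # Instance-level metadata forwarded on every request,
        # e.g. fields a proxy inspects for pre-checks.
        self.metadata = metadata or {}

    def build_request(self, prompt: str, **kwargs: Any) -> Dict[str, Any]:
        # Merge per-call metadata over the instance defaults, so callers
        # can add request-specific fields without repeating common ones.
        metadata = {**self.metadata, **kwargs.pop("metadata", {})}
        request: Dict[str, Any] = {
            "messages": [{"role": "user", "content": prompt}],
            **kwargs,
        }
        if metadata:
            request["metadata"] = metadata
        return request


llm = LLM(metadata={"team": "platform"})
req = llm.build_request("hello", metadata={"request_id": "abc123"})
```

With a merge like this, the proxy sees both the shared fields and the per-request ones in a single `metadata` object, matching how the OpenAI client accepts a flat string-to-string map.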