I recently published a package, llm-client, that can be very helpful for adding support for running other LLM providers, including OpenAI, Google, AI21, HuggingfaceHub, Aleph Alpha, Anthropic, and local models via transformers.
I stumbled over this issue yesterday when I wanted to use promptimize with a custom model. I'm not sure whether I completely understood your question, but isn't this already possible by passing a custom prompt_executor to a PromptCase?
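For illustration, here is a minimal sketch of what such a custom executor might look like. This assumes, per the comment above, that a `PromptCase` accepts a `prompt_executor` callable taking the prompt string and returning the model's response; the exact promptimize signature and the `my_custom_executor` name are assumptions, not confirmed API.

```python
# Hypothetical sketch of a custom prompt executor: a plain callable that
# maps a prompt string to a response string. Inside, you would call your
# own backend (local transformers, an HTTP API, etc.); this stub just
# echoes for illustration.

def my_custom_executor(prompt: str) -> str:
    # Replace this with a call to your own model.
    return f"[local-model] {prompt.upper()}"

# Assuming promptimize's PromptCase accepts such a callable, usage would
# look roughly like:
#   case = PromptCase("Say hi", prompt_executor=my_custom_executor)
# Here we just exercise the contract directly:
print(my_custom_executor("hello"))
```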
Do you have any plans to add configuration options that would allow the use of custom LLMs in future versions?