Customs LLMs #2

Open
Pierrelefort opened this issue Apr 25, 2023 · 2 comments

Comments

@Pierrelefort

Do you have any plans to add configuration options that would allow the use of custom LLMs in future versions?

@uripeled2

I recently published a package, llm-client, that can help enable support for running other LLMs, including OpenAI, Google, AI21, HuggingFace Hub, Aleph Alpha, Anthropic, and local models via transformers.

@steffenslavetinsky

I stumbled over this issue yesterday when I wanted to use promptimize with a custom model. I'm not sure whether I completely got your question, but isn't this already possible by passing a custom prompt_executor to a PromptCase?

Even a simple lambda function works in place of an LLM call:

from promptimize.prompt_cases import PromptCase
from promptimize import evals

simple_prompts = [
    # prompt_executor replaces the default LLM call; this stub just
    # returns a canned response, so no model or API key is needed
    PromptCase(
        "hello there!",
        lambda x: evals.any_word(x.response, ["hi", "hello"]),
        prompt_executor=lambda _x: "hi there!",
    ),
    PromptCase(
        # example prompt; the stub executor ignores it and returns a fixed string
        "name some of the greatest guitar players of all time",
        lambda x: evals.all_words(x.response, ["zappa", "hendrix"]),
        weight=2,
        prompt_executor=lambda _x: "zappa, hendrix, clapton, page, beck, ...",
    ),
]
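
For an actual custom model rather than a stub, the same hook should work: anything callable that takes the prompt and returns a string can be passed as prompt_executor. Below is a minimal sketch, not tested against promptimize, that wires in a local HuggingFace transformers pipeline; the distilgpt2 model and the run_local_model helper are illustrative assumptions, not part of promptimize itself.

from transformers import pipeline

from promptimize.prompt_cases import PromptCase
from promptimize import evals

# hypothetical wrapper around a local text-generation model so it matches the
# callable shape used above: takes the prompt, returns a response string
generator = pipeline("text-generation", model="distilgpt2")

def run_local_model(prompt):
    # returns the generated text (prompt plus continuation) for the given prompt
    return generator(prompt, max_new_tokens=32)[0]["generated_text"]

custom_model_prompts = [
    PromptCase(
        "hello there!",
        lambda x: evals.any_word(x.response, ["hi", "hello"]),
        prompt_executor=run_local_model,
    ),
]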
