What is the maximum temperature value that easy-llms supports?
Although OpenAI's documentation says gpt-4o supports temperatures from 0 to 2, I found that easy-llms errors out at 1.7 and above.
I was able to successfully query with a temperature up to 2 with a direct API request to OpenAI using chat completion.
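For reference, this is roughly the kind of direct request that worked for me (a minimal sketch: the payload is built but not sent, since sending requires an API key; `build_chat_payload` is a hypothetical helper name, and the 0–2 bound is the range OpenAI documents):

```python
import json

# Hypothetical helper: construct a Chat Completions payload with an
# explicit temperature. OpenAI documents temperature as a float in [0, 2].
def build_chat_payload(prompt, temperature):
    if not 0 <= temperature <= 2:
        raise ValueError("temperature must be between 0 and 2")
    return {
        "model": "gpt-4o",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": temperature,
    }

# Temperature 2 is accepted by the documented API range.
payload = build_chat_payload("what llm are you?", 2)
print(json.dumps(payload))
```

POSTing that payload to the chat completions endpoint with temperature=2 succeeded for me, so the limit I'm hitting seems specific to easy-llms.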
@7robots I seem to be able to call it with a temperature of 1.9 and 2:
```
% cat test.py
from llms.openai import *
answer = gpt_4o(temperature=1.9).run("what llm are you?")
print(answer)
answer = gpt_4o(temperature=2).run("what llm are you?")
print(answer)
```
This is using the latest version:
```
% pip list | grep easy-llms
easy-llms 0.1.7
```
Can you try a new/clean folder and create a fresh venv with the latest version (if you aren't on it already)?