I wanted to test this extension locally, but I ran into the following error. I also tested it against several different pages (Google, Bing, GitHub, etc.).
Error:
This model's maximum context length is 128000 tokens. However, your messages resulted in 133758 tokens. Please reduce the length of the messages.
I also checked Inspect → Console and saw the following error as well:
Refused to set unsafe header "User-Agent"
I worked around that header issue with the following code changes:
```js
// const openai = new OpenAIApi(
//   new Configuration({
//     apiKey: key,
//   })
// );
// The above hardcodes insertion of 'User-Agent', which the browser refuses.
let config = new Configuration({
  apiKey: key,
});
delete config.baseOptions.headers['User-Agent'];
const openai = new OpenAIApi(config);
```
But the maximum-token issue still occurs: the OpenAI API responds with status 400.
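A possible workaround for the 400 error (not part of the extension's current code, just a sketch) is to trim the captured page text before sending it, so the request stays under the model's 128,000-token context window. The 4-characters-per-token ratio below is only a rough heuristic; a real tokenizer such as tiktoken would be more accurate, and the budget/reserve numbers are assumptions:

```javascript
// Rough client-side truncation sketch. Assumes a 128k-token context window
// and reserves some budget for the model's reply; both numbers are guesses.
const MAX_CONTEXT_TOKENS = 128000;
const RESERVED_FOR_REPLY = 1000;

function truncateToTokenBudget(
  text,
  maxTokens = MAX_CONTEXT_TOKENS - RESERVED_FOR_REPLY
) {
  // Approximation: ~4 characters per token for English text.
  const approxCharsPerToken = 4;
  const maxChars = maxTokens * approxCharsPerToken;
  return text.length <= maxChars ? text : text.slice(0, maxChars);
}
```

The truncated string would then be passed as the message content instead of the full page text, e.g. `openai.createChatCompletion({ ..., messages: [{ role: 'user', content: truncateToTokenBudget(pageText) }] })`.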