🐛 [Bug]: New install - response keeps repeating the last line #1182
Comments
Hello, I have the same bug when using Mistral or Mixtral for text generation. It keeps repeating the last sentence over and over until I restart the container. I tried increasing the repeat penalty, but it does nothing.
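For reference, a minimal sketch of how the repeat penalty is typically passed to llama-cpp-python, the sampling knob being discussed here. The model path, prompt, and values are assumptions for illustration, not serge's actual configuration:

```python
# Minimal sketch, assuming a local GGUF model file; path and values are
# illustrative only, not serge's defaults.
from llama_cpp import Llama

llm = Llama(model_path="models/mistral-7b-instruct.Q4_K_M.gguf")  # hypothetical path

output = llm(
    "Explain what serge does.",
    max_tokens=256,
    temperature=0.7,
    repeat_penalty=1.3,  # raising this is the usual first attempt against repetition loops
)
print(output["choices"][0]["text"])
```

If raising repeat_penalty has no visible effect at all, that would suggest the parameter isn't reaching the sampler (or the sampler is misbehaving) rather than a model-level issue.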
I've noticed this for most, if not all, models I can test. This bug essentially makes serge useless.
This is probably a bug in llama-cpp-python. I will update it this week and do a new release. Which specific model are you all using? @SolutionsKrezus @fishscene
I'm currently using Mistral 7B and Mixtral @gaby
Apologies, I’m at work at the moment. Off the top of my head: I would see random replies marked/flagged as code snippets, and if the model started repeating itself, that was the end of anything useful, as all subsequent replies would only repeat. Of all the testing I did, getting 10 coherent replies was a major milestone, and even then it sometimes took multiple rounds of re-prompting (deleting my query and asking it slightly differently) to get to 10. A couple of models started spewing nonsense and repeats on the very first response. All this to say, testing should be very easy to do. Curious, though: OP is using Ryzen, and so am I: Ryzen 1700X, 32 GB RAM, no CUDA GPU (NVIDIA T400, I think), using the CPU for AI. Maybe this is isolated to Ryzen CPUs? Another behavior to note:
I don't think it is a Ryzen-related issue @fishscene
Same issue here. This pretty much renders the software completely useless :(
Can you try
Bug description
I just pulled the image and spun up a container with the default settings. I downloaded the Mistral-7B model and left everything at its defaults. I've tried a few short questions, and the answer repeats the last line until I stop the container.
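A minimal direct reproduction against llama-cpp-python (bypassing serge; the model filename and prompt below are assumptions) can help confirm whether the looping comes from the underlying library, as suspected above:

```python
# Minimal sketch to isolate the bug: query llama-cpp-python directly,
# outside serge. Model path and prompt are assumptions for illustration.
from llama_cpp import Llama

llm = Llama(model_path="models/mistral-7b-instruct.Q4_K_M.gguf", n_ctx=2048)

for i in range(3):
    out = llm(
        "Q: What is Docker?\nA:",
        max_tokens=128,
        stop=["Q:"],        # stop before the model starts a new question
        repeat_penalty=1.1,
    )
    print(f"--- reply {i + 1} ---")
    print(out["choices"][0]["text"].strip())
```

If the same last-line repetition shows up here as well, the bug is in llama-cpp-python (or the model/quantization itself); if not, it is more likely in how serge builds the prompt or streams the response.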
Steps to reproduce
Environment Information
Docker version: 25.0.3
OS: Ubuntu 22.04.4 LTS on kernel 5.15.0-97
CPU: AMD Ryzen 5 2400G
Browser: Firefox version 123.0
Screenshots
Relevant log output
Confirmations