I'm running into an error when trying to use Jan. The model I'm trying to use: My system: Edit:
Replies: 3 comments
You can try a smaller model, since Mixtral 7B needs roughly 40 GB of RAM. With 32 GB you can try 34B models.
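As a rough sanity check before downloading, you can estimate whether a quantized model will fit in RAM from its parameter count and quantization level. The sketch below is a hypothetical rule of thumb, not Jan's actual sizing logic; the ~4.5 bits/weight figure assumes a mid-range GGUF quant, and the overhead allowance is a guess.

```python
def estimate_ram_gb(params_billions: float,
                    bits_per_weight: float = 4.5,
                    overhead_gb: float = 1.0) -> float:
    """Rough RAM estimate for loading a quantized model.

    params_billions: parameter count in billions (e.g. 34 for a 34B model).
    bits_per_weight: effective bits per weight of the quantization
                     (roughly 4.5 for a mid-range 4-bit GGUF quant --
                     an assumption, check your specific file).
    overhead_gb: crude allowance for KV cache and runtime buffers.
    """
    return params_billions * bits_per_weight / 8 + overhead_gb

# A 34B model at ~4.5 bits/weight comes to roughly 20 GB,
# which is why it can fit on a 32 GB machine.
print(round(estimate_ram_gb(34), 1))
```

In practice the model file size on disk is usually the better guide: if the download is close to or larger than your free RAM, it won't load.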
I've encountered the same issue, and I've also seen this message: The model I'm using is: My system is: I've tried reloading, force-reloading, quitting the app, and restarting my computer. I'll try switching to a smaller model to see if that helps.
The app now handles errors better and will show a clear error message when there's an issue with loading a model. It appears that these models are outdated, and you'll need to find a newer quantized version of them.