GPU not being used in Jan #3651
Asked in Get Help · Answered by imtuyethan
I'm having some trouble with my Nvidia GPU not being utilized properly in the Jan app.

(Attached video: jan.mp4)
Answered by imtuyethan · Sep 13, 2024
Based on the details you provided, the problem is likely related to the ngl (number of GPU layers) setting for the models you're trying to load.

The default ngl for many models is 33, which exceeds the 4 GB VRAM capacity of your Nvidia GTX 1650. That's why you had to manually reduce ngl to around 25 or lower before the models would start up and use the GPU.
The ngl setting determines how many of the model's layers are offloaded to the GPU during inference. Higher ngl values require more VRAM, which is why the 20 GB model you tried to load wouldn't start up on your GPU.
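To see why a high ngl can overflow a small card, here is a minimal back-of-the-envelope sketch. It is not Jan's or llama.cpp's actual memory accounting; the model size, layer count, and overhead figures are assumptions for illustration only:

```python
# Rough, illustrative estimate of how many model layers fit in a VRAM
# budget. The even-split assumption is a simplification: real memory
# accounting also covers the KV cache, compute buffers, and per-layer
# size differences.

def max_gpu_layers(model_size_gb: float, total_layers: int,
                   vram_gb: float, overhead_gb: float = 0.5) -> int:
    """Return a conservative ngl value for the given VRAM budget."""
    per_layer_gb = model_size_gb / total_layers  # assume equal-sized layers
    usable_gb = max(vram_gb - overhead_gb, 0.0)  # reserve VRAM for buffers
    return min(total_layers, int(usable_gb / per_layer_gb))

# Example: a ~4 GB quantized model with 33 layers on a 4 GB GTX 1650.
# The default ngl=33 doesn't fit once overhead is counted, so a lower
# value (28 here) is suggested instead.
print(max_gpu_layers(model_size_gb=4.0, total_layers=33, vram_gb=4.0))
```

With a layer count that fits the budget, the remaining layers run on the CPU, which is slower but lets the model start.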
To resolve this issue, we recommend trying the following:

- Threads tab in Ja…