
GPU not being used in Jan #3651

Answered by imtuyethan
imtuyethan asked this question in Get Help

Based on the details you provided, the problem is likely related to the ngl (number of GPU layers) setting for the AI models you're trying to load.
The default ngl setting for many models is 33, which appears to exceed the 4 GB VRAM capacity of your Nvidia GTX 1650. That's why you had to manually reduce ngl to around 25 or lower to get the models to start up and use the GPU.
The ngl setting determines how many of the model's layers are offloaded to the GPU during inference. Higher ngl values require more VRAM, which is why the 20 GB model you tried to load wouldn't start up on your GPU.
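
As a rough back-of-envelope check (this is illustrative arithmetic, not Jan's actual allocation logic), you can estimate how many layers fit in VRAM from the model's on-disk size. The numbers below are assumptions taken from this thread:

```python
# Back-of-envelope estimate of how many GPU layers (ngl) fit in VRAM.
# All values are assumptions from this thread: a ~20 GB model file with
# 33 offloadable layers on a 4 GB GTX 1650.

model_size_gb = 20.0   # on-disk size of the quantized model (assumption)
total_layers = 33      # default ngl for this model (from the thread)
vram_gb = 4.0          # GTX 1650 VRAM
overhead_gb = 1.0      # rough headroom for KV cache / CUDA context (assumption)

per_layer_gb = model_size_gb / total_layers             # ~0.6 GB per layer
max_ngl = int((vram_gb - overhead_gb) / per_layer_gb)   # layers that fit

print(f"~{per_layer_gb:.2f} GB per layer; try ngl <= {max_ngl}")
```

For a model that large, only a handful of layers fit on a 4 GB card; smaller models leave room for a higher ngl.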

To resolve this issue, we recommend trying the following:

  • In Threads tab in Ja…
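
Once you've lowered ngl, a quick way to confirm the GPU is actually being used is to watch VRAM with nvidia-smi while the model loads. This polling loop is a sketch for verification only, not part of Jan:

```python
# Watch VRAM usage while the model loads. If memory.used barely rises
# above the baseline, the layers are staying on the CPU.
import subprocess
import time

def vram_used_mib() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.splitlines()[0])

baseline = vram_used_mib()
print(f"VRAM before load: {baseline} MiB")
for _ in range(12):        # poll for ~60 s while the model starts
    time.sleep(5)
    print(f"VRAM now: {vram_used_mib()} MiB")
```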

Replies: 1 comment

Answer selected by imtuyethan