GPU Usage #6
To determine if Belullama is using your GPU instead of your CPU, you can follow these steps:
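A minimal set of checks, assuming an NVIDIA GPU and that Ollama runs in a Docker container (the container name `belullama-ollama` below is a placeholder; adjust it to your setup). Treat this as a sketch rather than Belullama's documented procedure:

```bash
# Watch GPU utilization and VRAM while a response is being generated.
# If the model is offloaded to the GPU, you should see non-trivial GPU-Util
# and an ollama process holding VRAM.
watch -n 1 nvidia-smi

# Recent Ollama versions report where each loaded model is running (GPU vs CPU).
ollama ps

# If Ollama runs inside Docker, confirm the GPU is visible from within the
# container (replace belullama-ollama with your actual container name).
docker exec -it belullama-ollama nvidia-smi
```

If `ollama ps` shows the model on "100% CPU", or `nvidia-smi` shows no load while a response is generating, the GPU is not being used.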
If you're seeing high CPU usage (51-54%) during responses, it's possible that Ollama is still running on the CPU, typically because the GPU isn't being detected or used by the container. If issues persist after checking this, you might need to review the Belullama GPU installation script or look for any error messages during the setup process.
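One way to narrow this down is to verify that Docker itself can reach the GPU, independent of Belullama. A sketch assuming the NVIDIA Container Toolkit is installed (the CUDA image tag and container name are illustrative):

```bash
# Confirm the NVIDIA Container Toolkit works at all, outside of Belullama.
docker run --rm --gpus all nvidia/cuda:12.4.1-base-ubuntu22.04 nvidia-smi

# Search the Ollama container's logs for GPU/CUDA messages
# (replace belullama-ollama with your actual container name).
docker logs belullama-ollama 2>&1 | grep -i -E "gpu|cuda"
```

If the first command fails, the problem is in the host's Docker/GPU setup rather than in Belullama.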
So I tried your commands and none of them work; it doesn't even pick up that Ollama is installed, even though I can run it. The -smi commands all return "not found". I can try reinstalling it, but could you put up a better install guide for the Nvidia beta test from scratch? I had your original Docker version installed and then installed the Nvidia one.
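For what it's worth, `nvidia-smi` returning "not found" on the host usually means the NVIDIA driver itself isn't installed yet. A rough sketch of the host-side prerequisites on Ubuntu, based on NVIDIA's and Ubuntu's documented tooling (adjust for your distribution; this is not the Belullama install script):

```bash
# Install the recommended NVIDIA driver on the host (Ubuntu).
sudo ubuntu-drivers autoinstall
sudo reboot

# After rebooting, this should print your GPU details.
nvidia-smi

# Install the NVIDIA Container Toolkit so Docker containers can use the GPU.
# (Requires NVIDIA's apt repository to be configured first; see their docs.)
sudo apt-get install -y nvidia-container-toolkit
sudo nvidia-ctk runtime configure --runtime=docker
sudo systemctl restart docker
```

Only after both the driver and the container toolkit are in place does the Belullama GPU/Nvidia install have something to hand through to the container.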
So I gave it a lengthy prompt to see whether I can tell if it's using my CPU or GPU. Now how do I tell if it's using the GPU? When I look at my CPU usage while it's writing the response, I'm seeing it go to between 51 and 54%.