
Running models on CPU. #1

Open

zianttt opened this issue Oct 15, 2024 · 1 comment

Comments


zianttt commented Oct 15, 2024

Hi, I am exploring running smaller LLMs on CPU only. Is there a way to support a CPU-only mode?

huseinzol05 (Member) commented Oct 27, 2024

https://github.com/mesolitica/transformers-openai-api/blob/master/transformers_openai/main_cuda.py should be straightforward to adapt: if no GPU is detected, or the user explicitly defines a device, do not use the CUDA device. Feel free to open a PR.
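
A minimal sketch of the fallback the maintainer describes, assuming a torch-based loading path. The `TORCH_DEVICE` environment variable and the `resolve_device` / `load_model` helpers are illustrative only and are not part of the repository's actual API:

```python
import os

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer


def resolve_device() -> str:
    """Prefer a user-defined device, otherwise fall back to CPU when no GPU is found."""
    # Hypothetical env var for illustration; not an actual flag of transformers-openai-api.
    user_device = os.environ.get('TORCH_DEVICE')
    if user_device:
        return user_device
    return 'cuda' if torch.cuda.is_available() else 'cpu'


def load_model(model_name: str):
    """Load a small causal LM on the resolved device; a CPU-only machine resolves to 'cpu'."""
    device = resolve_device()
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(
        model_name,
        # Half precision only makes sense on CUDA; keep float32 on CPU.
        torch_dtype=torch.float16 if device.startswith('cuda') else torch.float32,
    ).to(device)
    return tokenizer, model, device
```

With this pattern, a machine without a GPU simply resolves to `'cpu'` and loads the weights in float32, so no CUDA-specific code path is ever hit.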
