
How to run inference on multiple GPUs #57

Open
GallonDeng opened this issue Aug 31, 2024 · 1 comment

@GallonDeng

How can I run inference on multiple GPUs, such as RTX 4090s, since the model needs much more than the 24 GB they provide?

@rob-hen
Collaborator

rob-hen commented Aug 31, 2024

Hi @AllenDun, thank you for your interest in our project.

There is currently no multi-GPU implementation. We are working on reducing the memory requirements.
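
Until then, one generic workaround, which is not something this repository implements, is manual pipeline-style model parallelism in plain PyTorch: place different stages of the network on different devices so that no single card has to hold all of the weights. The `TwoGPUModel` class and its layer sizes below are hypothetical placeholders for illustration only:

```python
# A minimal sketch (assuming two 24 GB GPUs, e.g. RTX 4090s) of manual
# pipeline-style model parallelism. TwoGPUModel and its layer sizes are
# hypothetical placeholders, not classes from this repository.
import torch
import torch.nn as nn

class TwoGPUModel(nn.Module):
    def __init__(self):
        super().__init__()
        # First half of the network lives on GPU 0, second half on GPU 1,
        # so neither device needs to fit the full set of weights.
        self.stage0 = nn.Sequential(nn.Linear(4096, 4096), nn.ReLU()).to("cuda:0")
        self.stage1 = nn.Sequential(nn.Linear(4096, 4096)).to("cuda:1")

    def forward(self, x):
        x = self.stage0(x.to("cuda:0"))
        # Transfer activations across the device boundary between stages.
        return self.stage1(x.to("cuda:1"))

model = TwoGPUModel().eval()
with torch.no_grad():
    out = model(torch.randn(1, 4096))
print(out.device)  # cuda:1
```

Libraries such as Hugging Face Accelerate can automate this kind of layer placement (e.g. loading with `device_map="auto"`), but either approach would still require adapting this codebase's model-loading code.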
