Hi, thanks for your wonderful work. Here's one question: I installed the package via pip and want to run inference on multiple GPU devices. Your inference.py code shows that it runs on a single GPU device at a given rank, one task at a time, without using something like DataParallel. Does that mean I cannot distribute inference across multiple devices to run simultaneously? Thanks a lot!
Hi @Lin-zeng. Currently, you can divide the total workload and assign each subtask to a different GPU. Because we currently target command-line inference, for simplicity we leave multi-device simultaneous inference to be wrapped by the user.
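One way to wrap this yourself is to shard the inputs and launch one inference process per GPU, pinning each process to a device with `CUDA_VISIBLE_DEVICES`. The sketch below is a minimal illustration; the `inference.py` command line and its `--inputs` flag are assumptions here, so adapt them to the package's actual CLI.

```python
import os
import subprocess

def shard(items, n):
    """Split items into n roughly equal, contiguous chunks (one per GPU)."""
    k, r = divmod(len(items), n)
    chunks, start = [], 0
    for i in range(n):
        end = start + k + (1 if i < r else 0)
        chunks.append(items[start:end])
        start = end
    return chunks

def launch(inputs, n_gpus):
    """Start one inference subprocess per GPU and wait for all to finish."""
    procs = []
    for gpu_id, chunk in enumerate(shard(inputs, n_gpus)):
        if not chunk:
            continue
        # Pin this subprocess to a single device.
        env = {**os.environ, "CUDA_VISIBLE_DEVICES": str(gpu_id)}
        # Hypothetical command; replace with the package's real inference CLI.
        cmd = ["python", "inference.py", "--inputs", ",".join(chunk)]
        procs.append(subprocess.Popen(cmd, env=env))
    for p in procs:
        p.wait()
```

Each subprocess sees only its assigned GPU as device 0, so the unmodified single-device `inference.py` runs unchanged while the chunks are processed in parallel.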