Hi there,
I found that the model is loaded in a distributed fashion across the GPUs during inference, but each iteration only runs inference on a single data sample. Is there any way to process multiple data samples at the same time?
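For illustration, here is a minimal sketch of the general idea of batched inference, assuming a PyTorch-style model that takes token id tensors; `model` and `samples` are hypothetical placeholders, not this repo's actual API:

```python
import torch

# Hypothetical per-sample inputs, e.g. token id sequences of different lengths.
samples = [torch.randint(0, 1000, (n,)) for n in (12, 7, 9)]

# Pad to a common length so the samples can be stacked into a single batch tensor.
batch = torch.nn.utils.rnn.pad_sequence(samples, batch_first=True, padding_value=0)

# Stand-in for the real (distributed) model; the point is the batched forward pass.
model = torch.nn.Embedding(1000, 64)
with torch.no_grad():
    out = model(batch)  # one forward pass over all samples at once
print(out.shape)  # (batch_size, max_len, hidden)
```

Does the inference loop here support passing a batch like this, or does it only accept one sample per iteration?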