set device on inference pipeline only if setter available
grajguru committed Nov 7, 2023
1 parent 32ea708 commit 24f9d53
Showing 1 changed file with 5 additions and 1 deletion.
6 changes: 5 additions & 1 deletion mii/legacy/models/load_models.py
@@ -89,7 +89,11 @@ def load_models(model_config):
     ds_engine.module.eval()  # inference
     inference_pipeline.model = ds_engine.module
 
-    if model_config.load_with_sys_mem:
+    # Set device property only if setter method available.
+    if (
+        model_config.load_with_sys_mem
+        and inference_pipeline.__class__.device.fset is not None
+    ):
         inference_pipeline.device = torch.device(f"cuda:{local_rank}")
 
     # Free up memory used when initially loading models
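The guard works because a Python `property` object stores its setter in the `fset` attribute, which is `None` when the property is read-only; looking the property up on the class (rather than the instance) avoids triggering the getter. A minimal sketch, using hypothetical pipeline classes and a hypothetical `set_device_if_possible` helper not present in the commit:

```python
class ReadOnlyPipeline:
    @property
    def device(self):
        # Read-only property: no setter defined, so fset is None.
        return "cpu"


class WritablePipeline:
    def __init__(self):
        self._device = "cpu"

    @property
    def device(self):
        return self._device

    @device.setter
    def device(self, value):
        self._device = value


def set_device_if_possible(pipeline, device):
    # Look up the descriptor on the class; instance access would
    # invoke the getter instead of returning the property object.
    prop = getattr(type(pipeline), "device", None)
    if isinstance(prop, property) and prop.fset is not None:
        pipeline.device = device
        return True
    return False


print(set_device_if_possible(WritablePipeline(), "cuda:0"))  # True
print(set_device_if_possible(ReadOnlyPipeline(), "cuda:0"))  # False
```

Without the check, assigning to `ReadOnlyPipeline().device` would raise `AttributeError`, which is the failure mode this commit avoids.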
