When I initialize torch.utils.data.DataLoader with multiple workers and use Cutout as a transform, I observe that identical transformed images are generated; the number of duplicates equals the number of workers. I suspect this is caused by the use of numpy.random in the Cutout source code: when the DataLoader forks its worker processes, each worker inherits the same numpy random state, so numpy.random produces the same sequence in every worker (torchvision.transforms that rely on torch's RNG are not affected, since each worker gets a distinct torch seed).
This issue can be fixed by replacing numpy.random with torch.randint in the Cutout source code.
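As a minimal sketch of the proposed fix (not the actual Cutout implementation, and the class/parameter names are illustrative), a Cutout-style transform that draws patch centers with torch.randint instead of numpy.random could look like this:

```python
import torch

class Cutout:
    """Mask out square patches of an image tensor.

    Sketch of the proposed fix: patch centers come from torch.randint,
    which uses the per-worker torch RNG state, so each DataLoader
    worker produces different masks.
    """
    def __init__(self, n_holes=1, length=16):
        self.n_holes = n_holes
        self.length = length

    def __call__(self, img):
        # img is a (C, H, W) tensor
        h, w = img.size(1), img.size(2)
        mask = torch.ones(h, w, dtype=img.dtype)

        for _ in range(self.n_holes):
            # torch.randint is seeded differently in each worker,
            # unlike numpy.random, whose state is copied on fork.
            y = torch.randint(h, (1,)).item()
            x = torch.randint(w, (1,)).item()

            y1 = max(y - self.length // 2, 0)
            y2 = min(y + self.length // 2, h)
            x1 = max(x - self.length // 2, 0)
            x2 = min(x + self.length // 2, w)

            mask[y1:y2, x1:x2] = 0.0

        return img * mask.expand_as(img)
```

An alternative workaround, without touching the Cutout code, is to reseed numpy per worker via DataLoader's worker_init_fn, e.g. `worker_init_fn=lambda worker_id: numpy.random.seed(torch.initial_seed() % 2**32)`.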