There is a known issue with our PyTorch 1.10 wheel, torch-1.10.0+computecanada. Multi-GPU code that uses DistributedDataParallel may fail unpredictably with this PyTorch version if the backend is set to 'nccl' or 'gloo'. We recommend using our latest PyTorch build instead of version 1.10 on all GP clusters.
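As a minimal sketch, a job script can guard against accidentally running the affected build before initializing DistributedDataParallel. The helper name `is_affected_build` is illustrative, not part of any official API; the check simply inspects the version string that `torch.__version__` would report.

```python
def is_affected_build(version: str) -> bool:
    """Return True if the version string is a PyTorch 1.10 build,
    e.g. '1.10.0+computecanada', which is known to fail unpredictably
    with DistributedDataParallel on the 'nccl' or 'gloo' backends."""
    return version.split("+")[0].startswith("1.10")


# In an actual training script, the check would use torch.__version__:
#
#     import torch
#     if is_affected_build(torch.__version__):
#         raise RuntimeError(
#             "PyTorch 1.10 has a known DistributedDataParallel issue; "
#             "please load a newer PyTorch wheel."
#         )
```

This lets a multi-GPU job fail fast with a clear message instead of failing unpredictably mid-training.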