AssertionError: Default process group is not initialized

The error above appears because I am running on a single GPU; it does not occur with a two-GPU distributed setup. To fix it, just add the following lines at the top of tool/train.py:

import torch.distributed as dist
# Create a single-process default group so code that assumes distributed training still works
dist.init_process_group('gloo', init_method='file:///tmp/somefile', rank=0, world_size=1)
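If you want the same script to keep working when it is later launched with a real multi-GPU launcher, a minimal sketch (my own variant, not from the original post) is to guard the call so the group is only created when no default group exists yet:

import torch.distributed as dist

# Only set up the single-process group if no default process group exists.
# Under a real distributed launcher the group is initialized elsewhere, so this is skipped.
if dist.is_available() and not dist.is_initialized():
    dist.init_process_group(
        backend='gloo',                      # CPU-friendly backend, as in the snippet above
        init_method='file:///tmp/somefile',  # shared-file rendezvous from the snippet above
        rank=0,
        world_size=1,
    )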
