pytorch RuntimeError: unable to open shared memory object in read-write mode: Too many open files

RuntimeError: unable to open shared memory object in read-write mode: Too many open files (24)

The fix commonly suggested online is to add the following two lines:

import torch.multiprocessing

torch.multiprocessing.set_sharing_strategy('file_system')
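For context, PyTorch's default sharing strategy on Linux ('file_descriptor') keeps one open file descriptor per tensor shared between the DataLoader workers and the main process, so batches containing many small tensors can hit the process's open-file limit; 'file_system' avoids holding those descriptors open. Below is a minimal sketch of where the call is usually placed (before any DataLoader workers start); the dataset and loader details are hypothetical, not from the original post.

import torch
import torch.multiprocessing
from torch.utils.data import DataLoader, TensorDataset

# Switch the sharing strategy before creating the DataLoader / starting workers.
torch.multiprocessing.set_sharing_strategy('file_system')

# Hypothetical dataset and loader, just to show the placement.
dataset = TensorDataset(torch.randn(1000, 8), torch.randint(0, 2, (1000,)))
loader = DataLoader(dataset, batch_size=32, num_workers=4)

for features, labels in loader:
    pass  # training step would go here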

It later turned out that the real cause was the collate_fn we passed to our custom DataLoader: the batch contained lists nested inside lists, and converting each inner list into a Tensor triggered this error, because that produced far too many tensors per batch. For reference, see torch.tensors in torch.multiprocessing · Issue #11899 · pytorch/pytorch · GitHub.
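A minimal sketch of the pattern described above; the function names and the batch layout are hypothetical illustrations, not the original collate_fn.

import torch

def problematic_collate(batch):
    # Each sample is (features, list_of_lists). Converting every inner list
    # into its own Tensor creates many small tensors per batch; with the
    # 'file_descriptor' strategy, each one costs an open file descriptor
    # when it is shared back to the main process.
    feats = torch.stack([sample[0] for sample in batch])
    labels = [[torch.tensor(inner) for inner in sample[1]] for sample in batch]
    return feats, labels

def safer_collate(batch):
    # Keep the nested lists as plain Python lists (or merge them into one
    # padded Tensor later), so each batch only carries a few shared tensors.
    feats = torch.stack([sample[0] for sample in batch])
    labels = [sample[1] for sample in batch]  # convert inside the training step if needed
    return feats, labels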
