stack(): functions with out=... arguments don't support automatic differentiation, but one of the arguments requires grad.

The following error is raised when loading data with multiple DataLoader workers:

Original Traceback (most recent call last):
  File "/hpc/users/HKUST-GZ/mics/.conda/envs/event/lib/python3.8/site-packages/torch/utils/data/_utils/worker.py", line 287, in _worker_loop
    data = fetcher.fetch(index)
  File "/hpc/users/HKUST-GZ/mics/.conda/envs/event/lib/python3.8/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
    return self.collate_fn(data)
  File "/hpc/users/HKUST-GZ/mics/smy/share/event_to_frame/utils/loader.py", line 32, in collate_events
    events = default_collate(events)
  File "/hpc/users/HKUST-GZ/mics/.conda/envs/event/lib/python3.8/site-packages/torch/utils/data/_utils/collate.py", line 138, in default_collate
    return torch.stack(batch, 0, out=out)
RuntimeError: stack(): functions with out=... arguments don't support automatic differentiation, but one of the arguments requires grad.

A quick web search turned up a question almost identical to mine. According to the answers, the cause is that the input data has requires_grad=True. Calling .detach() on the tensor when fetching the data, e.g. events = events.detach(), removes the gradient requirement and resolves the error.
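A minimal sketch of the fix, using a toy dataset (the class and variable names here are illustrative, not from the original loader.py): default_collate calls torch.stack(batch, 0, out=out) inside worker processes, and that out= path rejects tensors that require grad, so detaching in __getitem__ is enough.

```python
import torch
from torch.utils.data import Dataset, DataLoader

class EventDataset(Dataset):
    """Toy stand-in for the event dataset; names are hypothetical."""
    def __init__(self, n=8):
        # Simulate samples that accidentally carry requires_grad=True,
        # e.g. because they came out of a differentiable op.
        self.samples = [torch.randn(4, requires_grad=True) for _ in range(n)]

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, idx):
        # detach() returns a view that no longer requires grad, so
        # default_collate's torch.stack(..., out=out) path (taken in
        # worker processes) no longer raises the RuntimeError.
        return self.samples[idx].detach()

# num_workers=0 here just to keep the demo portable; the original error
# only shows up with num_workers > 0, but detach() is safe either way.
loader = DataLoader(EventDataset(), batch_size=4, num_workers=0)
batch = next(iter(loader))
print(batch.requires_grad)  # False
```

Detaching in the dataset (or inside the custom collate_events before calling default_collate) is preferable to detaching after batching, since it stops the grad-tracking tensors from ever reaching torch.stack in the workers.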
