RuntimeError: Trying to backward through the graph a second time

RuntimeError: Trying to backward through the graph a second time (or directly access saved tensors after they have already been freed). Saved intermediate values of the graph are freed when you call .backward() or autograd.grad(). Specify retain_graph=True if you need to backward through the graph a second time or if you need to access saved tensors after calling backward.

One way to track this down: check whether an output of one computation graph is being fed into a second network while still carrying gradient tracking, so that a later backward() traverses the first graph again after its saved tensors were freed. A fix along those lines:

out = model_1(inputs=inputs_1)                          # graph 1: model_1's forward pass
out_2 = model_2(inputs=inputs_2, labels=out.detach())   # detach() cuts graph 1 off from graph 2
loss = loss_fn(out_2, gts)
loss.backward()                                         # backward now only traverses model_2's graph
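
As a self-contained illustration, here is a minimal sketch (the modules, shapes, and names are made up for the example) that first reproduces the error by backpropagating through the same graph twice, then avoids it with .detach():

import torch

x = torch.randn(4, 3)
model_1 = torch.nn.Linear(3, 3)
model_2 = torch.nn.Linear(3, 1)

# Reproduce the error: the first backward() frees the saved tensors
# of model_1's graph, so a second backward through `out` fails.
out = model_1(x)
model_2(out).sum().backward()
# model_2(out).sum().backward()  # RuntimeError: Trying to backward through the graph a second time

# Fix: detach() so later backward() calls never reach model_1's graph.
out = model_1(x)
model_2(out.detach()).sum().backward()
model_2(out.detach()).sum().backward()  # safe to repeat

If gradients genuinely need to flow through model_1's graph more than once, the alternative the error message itself suggests is passing retain_graph=True to the first backward(), at the cost of keeping the saved tensors in memory.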
