pytorch: grad can be implicitly created only for scalar outputs
I ran into this error a long time ago, but I never found a clear explanation of it online, so I'm writing one up here. For reference, here is the docstring of autograd.grad():

grad(outputs, inputs, grad_outputs=None, retain_graph=None, create_graph=False, only_inputs=True, allow_unused=False)

Computes and returns the sum of gradients of outputs with respect to the inputs.
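As a minimal sketch (the tensor shapes and values below are just illustrative), the error appears when the output passed to autograd.grad() (or .backward()) is not a scalar, because PyTorch cannot implicitly choose grad_outputs for a non-scalar output. Two common ways around it are reducing the output to a scalar first, or passing grad_outputs explicitly:

```python
import torch

x = torch.randn(3, requires_grad=True)
y = x * 2  # y is a non-scalar (vector) output

# This would raise "grad can be implicitly created only for scalar outputs":
# torch.autograd.grad(y, x)

# Fix 1: reduce the output to a scalar before differentiating
loss = y.sum()
g1 = torch.autograd.grad(loss, x, retain_graph=True)

# Fix 2: supply grad_outputs explicitly (the vector in the vector-Jacobian product)
g2 = torch.autograd.grad(y, x, grad_outputs=torch.ones_like(y))

print(g1)
print(g2)
```

With grad_outputs=torch.ones_like(y), the two calls give the same gradient, since summing y and weighting each element of y by 1 are equivalent.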