PyTorch: the difference between torch.nn and torch.nn.functional

We know that torch.nn and torch.nn.functional contain many similar activation functions and loss functions, for example:

torch.nn.ReLU vs. torch.nn.functional.relu

torch.nn.MSELoss vs. torch.nn.functional.mse_loss
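
As a quick sanity check, here is a minimal sketch (the input tensors are just placeholders) showing that each class/function pair computes the same result:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(2, 3)
target = torch.randn(2, 3)

# Class version: instantiate first, then call it like a function.
assert torch.equal(nn.ReLU()(x), F.relu(x))
# The loss pair behaves the same way.
assert torch.equal(nn.MSELoss()(x, target), F.mse_loss(x, target))
```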

  • So what is the difference between them?

1. When defining a layer (subclassing nn.Module), use torch.nn in `__init__`, e.g. torch.nn.ReLU or torch.nn.Dropout2d, and use torch.nn.functional in forward, e.g. torch.nn.functional.relu. Note that what you define in `__init__` are standard network layers: only parameters of layers defined through torch.nn are registered and trained automatically, whereas with torch.nn.functional you have to create and manage the parameters yourself. So, as a rule, activation functions, convolutions, and the like are defined with torch.nn (see the sketch after this list).
2. torch.nn provides classes, which must first be instantiated in `__init__` and then called in forward, while torch.nn.functional provides functions that can be called directly in forward.
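
To make point 1 concrete, here is a minimal sketch (SmallNet and its layer sizes are hypothetical, chosen only for illustration): learnable layers are instantiated as torch.nn modules in `__init__`, while parameter-free operations go through torch.nn.functional in forward:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Layers with learnable weights: defined via torch.nn in __init__,
        # so their parameters are registered and seen by the optimizer.
        self.conv = nn.Conv2d(1, 8, kernel_size=3, padding=1)
        self.fc = nn.Linear(8 * 28 * 28, 10)
        # Dropout as a module, so model.eval() disables it automatically.
        self.drop = nn.Dropout2d(p=0.5)

    def forward(self, x):
        # Parameter-free ops can be called directly from torch.nn.functional.
        x = F.relu(self.conv(x))
        x = self.drop(x)
        return self.fc(x.flatten(1))

net = SmallNet()
out = net(torch.randn(1, 1, 28, 28))             # shape: (1, 10)
print(sum(p.numel() for p in net.parameters()))  # only torch.nn-defined weights
```

By contrast, if you computed the convolution with torch.nn.functional.conv2d, you would have to create the weight and bias tensors yourself (e.g. as nn.Parameter) for them to be trained at all, which is exactly the manual parameter management mentioned above.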

