Problem encountered:
Traceback (most recent call last):
  File "main.py", line 29, in <module>
    solver.test()
  File "/home/nk/zjc/PycharmProjects/3ClassicAlgorithm/MINet/MINet(Relu)/MINet (weight)(FPNres) (selfmade-channel)(x2)/MINet-master/code/utils/solver.py", line 236, in test
    results = self.__test_process(save_pre=self.save_pre)
  File "/home/nk/zjc/PycharmProjects/3ClassicAlgorithm/MINet/MINet(Relu)/MINet (weight)(FPNres) (selfmade-channel)(x2)/MINet-master/code/utils/solver.py", line 274, in __test_process
    outputs = self.net(in_imgs)
  File "/home/nk/anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/nk/zjc/PycharmProjects/3ClassicAlgorithm/MINet/MINet(Relu)/MINet (weight)(FPNres) (selfmade-channel)(x2)/MINet-master/code/network/MINet.py", line 171, in forward
    in_data_1, in_data_2, in_data_4, in_data_8, in_data_16, w0a, w1a, w2a, w3a, w4a, w0b, w1b, w2b, w3b, w4b  # todo W
  File "/home/nk/anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/nk/zjc/PycharmProjects/3ClassicAlgorithm/MINet/MINet(Relu)/MINet (weight)(FPNres) (selfmade-channel)(x2)/MINet-master/code/module/MyModule.py", line 468, in forward
    out_xs.append(self.conv1(in_data_1, in_data_2, in_data_4, w1a, w1b))  # todo
  File "/home/nk/anaconda3/lib/python3.7/site-packages/torch/nn/modules/module.py", line 493, in __call__
    result = self.forward(*input, **kwargs)
  File "/home/nk/zjc/PycharmProjects/3ClassicAlgorithm/MINet/MINet(Relu)/MINet (weight)(FPNres) (selfmade-channel)(x2)/MINet-master/code/module/MyModule.py", line 299, in forward
    out = self.relu(self.bnm_3(self.m2m_3(m)) * wa + self.identity(in_m) * wb)  # todo
RuntimeError: The size of tensor a (2) must match the size of tensor b (4) at non-singleton dimension 0
Cause:
This PyTorch error occurs when a per-batch computation uses tensors whose batch dimension is fixed to batch_size (in the post linked below, it is the binary cross-entropy loss), while the total number of images is not evenly divisible by batch_size, so the last batch contains fewer images than batch_size.
In my case batch_size is 4 and the number of test images is not divisible by 4, so the final batch holds only 2 images while the weight tensors (wa, wb) still have a batch dimension of 4, which produces the size mismatch at dimension 0.
Reference: https://blog.csdn.net/S20144144/article/details/100015058
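A minimal workaround sketch, assuming the test images are fed through a standard torch.utils.data.DataLoader (the project's actual loader code is not shown above, and the dummy dataset below is only for illustration):

# Sketch only: a dummy dataset of 10 images stands in for the real test set;
# 10 is not divisible by batch_size=4, so the last batch would hold only 2 images.
import torch
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.randn(10, 3, 320, 320))

# Option 1: drop the incomplete final batch so every batch has exactly 4 images.
test_loader = DataLoader(dataset, batch_size=4, shuffle=False, drop_last=True)

for (imgs,) in test_loader:
    print(imgs.shape)  # torch.Size([4, 3, 320, 320]) for every batch

# Option 2 (keeps every image): slice the fixed-size weight tensors to the actual
# batch size before the elementwise multiply that raised the error, e.g. in forward():
#     n = in_m.size(0)   # 2 for the last batch instead of 4
#     out = self.relu(self.bnm_3(self.m2m_3(m)) * wa[:n] + self.identity(in_m) * wb[:n])

Note that drop_last=True silently skips the leftover test images, so if every test image must be evaluated, something along the lines of option 2 (adapting the weights to the actual batch size) is the safer choice.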