NotImplementedError raised while building a torch model

The error looks roughly like this. After all these years of using torch, this is the first time I've ever hit a NotImplementedError here, and I'm not even running some nightly build.

Traceback (most recent call last):
  File "xxxxx\x.py", line 268, in <module>
    print(x(y).shape)
  File "xxxxx\lib\site-packages\torch\nn\modules\module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "xxxxx\x.py", line 259, in forward
    x = self.features(x)
  File "xxxxx\lib\site-packages\torch\nn\modules\module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "xxxxx\lib\site-packages\torch\nn\modules\container.py", line 119, in forward
    input = module(input)
  File "xxxxx\lib\site-packages\torch\nn\modules\module.py", line 889, in _call_impl
    result = self.forward(*input, **kwargs)
  File "xxxxx\lib\site-packages\torch\nn\modules\module.py", line 201, in _forward_unimplemented
    raise NotImplementedError
NotImplementedError

In _call_impl, self.forward is called:

result = self.forward(*input, **kwargs)

If you subclass nn.Module but never implement forward yourself, the lookup resolves to the default _forward_unimplemented, which simply does:

raise NotImplementedError
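
A minimal repro, assuming only a stock torch install (the exact wording varies by version: the 1.8-era module.py in the traceback above raises a bare NotImplementedError, while newer releases add the offending module's name to the message):

import torch
import torch.nn as nn

class Broken(nn.Module):
    def __init__(self):
        super().__init__()
        # no forward() defined here, so nn.Module's placeholder is used

m = Broken()
m(torch.randn(2, 3))  # raises NotImplementedError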

It turns out that when I wrote this module, there really was no forward method, haha:

import torch
import torch.nn as nn
import torch.nn.functional as F

class Hswish(nn.Module):

    def __init__(self, inplace=True):
        super(Hswish, self).__init__()
        self.inplace = inplace

    def __swish(self, x, beta, inplace=True):
        # Note: this swish is not actually used by H-swish.
        # It is called "H-swish" because the sigmoid is made "hard":
        # it is approximated by ReLU6(x + 3) / 6,
        # which cuts the compute cost for embedded deployment.
        # (F.sigmoid takes no inplace argument, so use torch.sigmoid)
        return x * torch.sigmoid(beta * x)

    @staticmethod
    def Hsigmoid(x, inplace=True):
        return F.relu6(x + 3, inplace=inplace) / 6

    def foward(self, x):
        return x * self.Hsigmoid(x, self.inplace)

I had spelled forward as foward.
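
For the record, here is a fixed version, as a minimal sketch (the unused __swish helper is dropped, and the shape check at the end is just an illustrative self-test):

import torch
import torch.nn as nn
import torch.nn.functional as F

class Hswish(nn.Module):

    def __init__(self, inplace=True):
        super(Hswish, self).__init__()
        self.inplace = inplace

    @staticmethod
    def Hsigmoid(x, inplace=True):
        # hard sigmoid: ReLU6(x + 3) / 6 approximates sigmoid
        return F.relu6(x + 3, inplace=inplace) / 6

    def forward(self, x):  # spelled correctly this time
        return x * self.Hsigmoid(x, self.inplace)

x = torch.randn(1, 16, 32, 32)
print(Hswish()(x).shape)  # torch.Size([1, 16, 32, 32])

Newer torch releases also ship nn.Hardswish, which computes the same ReLU6-based approximation, so torch.allclose(Hswish()(x), nn.Hardswish()(x)) should hold.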
