Repost: Backpropagation in resnet and densenet, explained

Original author: 迷途的Go
Source link: https://www.jianshu.com/p/7ee2650fe1ea

Backpropagation in a plain network

(Figure: resnet_block.png, a plain two-layer network)

z=W_1 x,\quad h=\phi(z),\quad o=W_2 h,\quad L=(o-y)^2

Initialize the network parameters with some starting values; after a forward pass, the values of x, z, h, and o in the figure are all known.

Since o and h are known, we have \frac{\partial L}{\partial o}=2(o-y), \frac{\partial o}{\partial h}=W_2, \frac{\partial h}{\partial z}=\frac{\partial \phi(z)}{\partial z}, and \frac{\partial z}{\partial W_1}=x^T. Substituting into the chain rule gives

\frac{\partial L}{\partial W_1}=\frac{\partial L}{\partial o}\frac{\partial o}{\partial h}\frac{\partial h}{\partial z}\frac{\partial z}{\partial W_1}=\frac{\partial L}{\partial o}\, W_2 \otimes \frac{\partial \phi(z)}{\partial z}\, x^T = 2(o-y)\, W_2 \otimes \frac{\partial \phi(z)}{\partial z}\, x^T
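To make the chain rule concrete, here is a minimal numpy sketch of this two-layer network that checks the hand-derived \partial L/\partial W_1 against a finite difference. The layer sizes and the choice of tanh for \phi are illustrative assumptions, not from the original post; note that in matrix form the W_2 factor enters as W_2^T, and \otimes is the elementwise product with \phi'(z).

```python
import numpy as np

# Minimal sketch of the two-layer network above; the sizes and
# tanh-as-phi are illustrative assumptions, not from the post.
rng = np.random.default_rng(0)
n, m, k = 4, 3, 2
x = rng.normal(size=(n, 1))
y = rng.normal(size=(k, 1))
W1 = rng.normal(size=(m, n))
W2 = rng.normal(size=(k, m))

# Forward pass: z = W1 x, h = phi(z), o = W2 h, L = ||o - y||^2
z = W1 @ x
h = np.tanh(z)
o = W2 @ h
L = np.sum((o - y) ** 2)

# Backward pass, term by term as in the formula:
dL_do = 2 * (o - y)                    # dL/do = 2(o - y)
dL_dh = W2.T @ dL_do                   # the W2 factor, as W2^T in matrix form
dL_dz = dL_dh * (1 - np.tanh(z) ** 2)  # elementwise phi'(z): the "otimes"
dL_dW1 = dL_dz @ x.T                   # dz/dW1 contributes x^T

# Finite-difference check of one entry of dL/dW1
eps = 1e-6
W1p = W1.copy()
W1p[0, 0] += eps
Lp = np.sum((W2 @ np.tanh(W1p @ x) - y) ** 2)
print(dL_dW1[0, 0], (Lp - L) / eps)  # the two numbers should agree closely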

Backpropagation in resnet

A plain two-layer network:


(Figure: resnet_block.png, a plain two-layer block)

For a plain block, the gradient reaching l_1 is

\frac{\partial L}{\partial l_1}=\frac{\partial L}{\partial l_2}\frac{\partial l_2}{\partial o}\frac{\partial o}{\partial l_1}=\frac{\partial L}{\partial l_2}\frac{\partial l_2}{\partial o}\, W

A two-layer resnet block:


(Figure: resnet_block1.png, a two-layer resnet block with a skip connection)

\frac{\partial L}{\partial l_1}=\frac{\partial L}{\partial l_2}\frac{\partial l_2}{\partial o}\frac{\partial o}{\partial l_1}=\frac{\partial L}{\partial l_2}\frac{\partial l_2}{\partial o} \frac{\partial(o_1+l_1)}{\partial l_1}=\frac{\partial L}{\partial l_2} \frac{\partial l_2}{\partial o}(W+1)

Compared with the two-layer block without the residual, an extra 1 appears alongside W: part of the gradient is passed back without attenuation, which mitigates vanishing gradients.
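A quick autograd check of that "+1" (a hedged sketch; the vector sizes and the use of PyTorch are assumptions for illustration): the gradient through the residual block equals the plain block's gradient plus an undamped identity term.

```python
import torch

# Hedged autograd check of the "+1"; sizes are illustrative.
torch.manual_seed(0)
W = torch.randn(3, 3)

l1 = torch.randn(3, requires_grad=True)
(W @ l1).sum().backward()          # plain block: do/dl1 = W
g_plain = l1.grad.clone()

l1.grad = None
(W @ l1 + l1).sum().backward()     # resnet block: do/dl1 = W + I
g_res = l1.grad.clone()

# The residual gradient is the plain one plus an identity term,
# i.e. the gradient also flows back undamped through the skip.
print(torch.allclose(g_res, g_plain + torch.ones(3)))  # True
```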

Backpropagation in densenet

A two-layer densenet block looks like this:


(Figure: densenet_block.png, a two-layer densenet block with concatenation)

The gradient backpropagated from L to l_1 is:

\frac{\partial L}{\partial l_1}=\frac{\partial L}{\partial l_2}\frac{\partial l_2}{\partial o}\frac{\partial o}{\partial l_1}=\frac{\partial L}{\partial l_2}\frac{\partial l_2}{\partial o} \frac{\partial(o_1\,\mathrm{cat}\,l_1)}{\partial l_1}=\frac{\partial L}{\partial l_2} \frac{\partial l_2}{\partial o}(W\,\mathrm{cat}\,1)

Here cat is the concatenation operator (cat = concatenate), which joins two tensors along a dimension: ones with the shape of l_1 are concatenated onto W, so an identity path survives in the gradient, just as in resnet.
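The same point can be checked with autograd (again a sketch with assumed sizes): when o = (W l_1) cat l_1, the gradient arriving back at l_1 splits into a W^T part from the transformed half and an untouched identity part from the skip half.

```python
import torch

# Hedged sketch of the densenet case: o = cat(W l1, l1); the
# upstream gradient g is split across the two halves of o.
torch.manual_seed(0)
W = torch.randn(3, 3)
l1 = torch.randn(3, requires_grad=True)

o = torch.cat([W @ l1, l1])   # concatenation instead of addition
g = torch.randn(6)            # arbitrary upstream gradient dL/do
o.backward(g)

# The first half routes through W^T; the second half (the skip)
# passes through unchanged: the "(W cat 1)" of the formula.
print(torch.allclose(l1.grad, W.T @ g[:3] + g[3:]))  # True
```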
