Reference: http://cs231n.stanford.edu/vecDerivs.pdf
How do we differentiate $y = Wx$? Here:
- $y$: $C \times 1$
- $W$: $C \times D$
- $x$: $D \times 1$
To build intuition, first compute a special case such as $\frac{\partial y_7}{\partial x_3}$. $y_7$ can be written as

$$y_7 = \sum_{j=1}^{D} W_{7,j} x_j = W_{7,1} x_1 + W_{7,2} x_2 + W_{7,3} x_3 + \cdots$$

so $\frac{\partial y_7}{\partial x_3} = W_{7,3}$. More generally, $\frac{\partial y}{\partial x} = W$.
PS: the derivative of a scalar with respect to a vector has dimension $1 \times n$; the derivative of a vector with respect to a scalar has dimension $n \times 1$.
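As a sanity check, here is a minimal numpy sketch (the sizes `C`, `D`, the seed, and the finite-difference step `eps` are illustrative choices, not from the original) that builds the Jacobian of $y = Wx$ by central differences and confirms it equals $W$:

```python
import numpy as np

# Numerically verify that the Jacobian of y = Wx is W itself.
C, D, eps = 4, 3, 1e-6  # illustrative sizes and step
rng = np.random.default_rng(0)
W = rng.standard_normal((C, D))
x = rng.standard_normal(D)

# J[i, j] = dy_i / dx_j, estimated by perturbing each x_j in turn.
J = np.zeros((C, D))
for j in range(D):
    dx = np.zeros(D)
    dx[j] = eps
    J[:, j] = (W @ (x + dx) - W @ (x - dx)) / (2 * eps)

assert np.allclose(J, W, atol=1e-5)  # dy/dx == W
```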
Given $y = xW$, how do we find $\frac{\partial y}{\partial W}$? Here:
- $y$: $1 \times C$
- $W$: $D \times C$
- $x$: $1 \times D$
Again start from a special case, $\frac{\partial y_3}{\partial W_{78}}$. First,

$$y_3 = x_1 W_{13} + x_2 W_{23} + \cdots + x_D W_{D3}$$

Since $W_{78}$ does not appear in this sum, $\frac{\partial y_3}{\partial W_{78}} = 0$. More generally, we find

$$\frac{\partial y_j}{\partial W_{ij}} = x_i$$

So let $F_{i,j,k} = \frac{\partial y_i}{\partial W_{jk}}$; then

$$F_{i,j,i} = x_j$$

All other entries of the tensor $F$ are 0, so we can define a two-dimensional matrix $G_{i,j} = F_{i,j,i}$ to represent the result of $\frac{\partial y}{\partial W}$.
PS: Representing the important part of derivative arrays in a compact way is critical to efficient implementations of neural networks.
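The sketch below (sizes, seed, and `eps` are again illustrative) builds the full three-dimensional Jacobian $F$ for $y = xW$ by finite differences, then confirms that only the $F_{i,j,i}$ entries are nonzero and that the compact matrix $G$ simply repeats $x$ in every row:

```python
import numpy as np

# Build F[i, j, k] = dy_i / dW_{jk} for y = x W and inspect its sparsity.
D, C, eps = 3, 4, 1e-6  # illustrative sizes and step
rng = np.random.default_rng(0)
x = rng.standard_normal(D)
W = rng.standard_normal((D, C))

F = np.zeros((C, D, C))
for j in range(D):
    for k in range(C):
        dW = np.zeros((D, C))
        dW[j, k] = eps
        F[:, j, k] = (x @ (W + dW) - x @ (W - dW)) / (2 * eps)

# Compact representation: G[i, j] = F[i, j, i] = x_j, so every row of G is x.
G = np.stack([F[i, :, i] for i in range(C)])
assert np.allclose(G, np.tile(x, (C, 1)), atol=1e-5)

# Everything off the k == i "diagonal" of the tensor vanishes.
mask = np.zeros(F.shape, dtype=bool)
for i in range(C):
    mask[i, :, i] = True
assert np.allclose(F[~mask], 0.0, atol=1e-5)
```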
Given $Y = XW$, how do we find $\frac{\partial Y_{a,b}}{\partial X_{c,d}}$? Here:
- $Y$: $n \times C$
- $W$: $D \times C$
- $X$: $n \times D$
Again expand:

$$Y_{i,j} = \sum_{k=1}^{D} X_{i,k} W_{k,j}$$

which gives

$$\frac{\partial Y_{i,j}}{\partial X_{i,k}} = W_{k,j} \tag{1}$$

Therefore

$$\frac{\partial Y_{a,b}}{\partial X_{c,d}} = \begin{cases} W_{d,b}, & \text{when}\ a = c \\ 0, & \text{when}\ a \neq c \end{cases}$$
We can see that each row of $Y$ depends only on the corresponding row of $X$, so the matrix case reduces to the single-row case above.
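A minimal sketch (sizes, seed, and `eps` are illustrative) that probes every index pair $(c, d)$ and checks both cases of the formula above:

```python
import numpy as np

# Check dY_{a,b} / dX_{c,d} = W_{d,b} when a == c, and 0 otherwise.
n, D, C, eps = 2, 3, 4, 1e-6  # illustrative sizes and step
rng = np.random.default_rng(0)
X = rng.standard_normal((n, D))
W = rng.standard_normal((D, C))

def dY_dX(c, d):
    """Finite-difference derivative of every entry of Y w.r.t. X[c, d]."""
    dX = np.zeros((n, D))
    dX[c, d] = eps
    return ((X + dX) @ W - (X - dX) @ W) / (2 * eps)

for c in range(n):
    for d in range(D):
        grad = dY_dX(c, d)  # shape (n, C)
        assert np.allclose(grad[c], W[d], atol=1e-5)                  # a == c
        assert np.allclose(np.delete(grad, c, axis=0), 0, atol=1e-5)  # a != c
```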
Given $y = Vm$ where $m = Wx$, how do we find $\frac{\partial y}{\partial x}$?
Again start from a special case:
$$\begin{aligned} \frac{\partial y_i}{\partial x_j} &= \frac{\partial y_i}{\partial m}\frac{\partial m}{\partial x_j} \\ &= \sum_{k=1}^{M} \frac{\partial y_i}{\partial m_k}\frac{\partial m_k}{\partial x_j} \\ &= \sum_{k=1}^{M} V_{i,k} W_{k,j} \\ &= V_{i,:} W_{:,j} \end{aligned}$$
Therefore

$$\frac{\partial y}{\partial x} = VW$$
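To close the loop, a minimal sketch (sizes, seed, and `eps` are illustrative) confirming that the Jacobian of the composition $y = V(Wx)$ is the matrix product $VW$:

```python
import numpy as np

# For y = V m with m = W x, verify dy/dx == V W numerically.
C, M, D, eps = 4, 5, 3, 1e-6  # illustrative sizes and step
rng = np.random.default_rng(0)
V = rng.standard_normal((C, M))
W = rng.standard_normal((M, D))
x = rng.standard_normal(D)

f = lambda v: V @ (W @ v)  # y as a function of x

J = np.zeros((C, D))
for j in range(D):
    dx = np.zeros(D)
    dx[j] = eps
    J[:, j] = (f(x + dx) - f(x - dx)) / (2 * eps)

assert np.allclose(J, V @ W, atol=1e-5)  # dy/dx == VW
```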