TensorFlow 2.0 Stochastic Gradient Descent: The Chain Rule of Differentiation

6.6 The Chain Rule of Differentiation

  • Chain rule

Chain Rule

Differentiation rule:

[Figure: table of derivative rules]

Formula:

$$\frac{\partial y}{\partial x} = \frac{\partial y}{\partial u}\,\frac{\partial u}{\partial x}$$
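Before turning to TensorFlow, the chain rule can be checked numerically with plain Python. This is a hypothetical example (the functions `u` and `y` are not from the text): take y = u² with u = 3x, so dy/dx = (dy/du)(du/dx) = 2u · 3 = 18x.

```python
def u(x):
    return 3.0 * x

def y(x):
    return u(x) ** 2

def numeric_grad(f, x, h=1e-6):
    # Central-difference approximation of df/dx.
    return (f(x + h) - f(x - h)) / (2 * h)

x0 = 2.0
dy_dx = numeric_grad(y, x0)   # direct derivative of the composite function
dy_du = 2 * u(x0)             # analytic dy/du evaluated at u(x0)
du_dx = 3.0                   # analytic du/dx
chain = dy_du * du_dx         # chain-rule product

print(dy_dx, chain)  # both ≈ 36.0 (18 * x0)
```

The direct numerical derivative and the chain-rule product agree, which is exactly the identity the TensorFlow example below verifies with automatic differentiation.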

import tensorflow as tf

x = tf.constant(1.)
w1 = tf.constant(2.)
b1 = tf.constant(1.)
w2 = tf.constant(2.)
b2 = tf.constant(1.)


# persistent=True lets us call tape.gradient() more than once.
with tf.GradientTape(persistent=True) as tape:

	# Constants are not tracked by default, so watch them explicitly.
	tape.watch([w1, b1, w2, b2])

	y1 = x * w1 + b1    # first linear layer
	y2 = y1 * w2 + b2   # second linear layer

dy2_dy1 = tape.gradient(y2, [y1])[0]   # ∂y2/∂y1 = w2
dy1_dw1 = tape.gradient(y1, [w1])[0]   # ∂y1/∂w1 = x
dy2_dw1 = tape.gradient(y2, [w1])[0]   # ∂y2/∂w1, computed directly


# Chain rule: ∂y2/∂w1 = ∂y2/∂y1 · ∂y1/∂w1
print(dy2_dy1 * dy1_dw1)   # tf.Tensor(2.0, shape=(), dtype=float32)
print(dy2_dw1)   # tf.Tensor(2.0, shape=(), dtype=float32)
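The gradients above can also be derived by hand, which explains why both prints show 2.0. Since y1 = x·w1 + b1 and y2 = y1·w2 + b2, the chain rule gives ∂y2/∂w1 = (∂y2/∂y1)(∂y1/∂w1) = w2 · x. A minimal plain-Python check, using the same constant values as the TensorFlow snippet:

```python
# Same values as the TensorFlow example above.
x, w1, b1, w2, b2 = 1.0, 2.0, 1.0, 2.0, 1.0

dy2_dy1 = w2                    # ∂y2/∂y1 = w2 = 2.0
dy1_dw1 = x                     # ∂y1/∂w1 = x = 1.0
dy2_dw1 = dy2_dy1 * dy1_dw1     # chain rule: w2 * x = 2.0

print(dy2_dw1)  # 2.0
```

This matches `tape.gradient(y2, [w1])[0]`: the tape composes the per-operation derivatives exactly as the chain rule prescribes.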
