How to apply gradient clipping in TensorFlow?

Gradient clipping needs to happen after computing the gradients, but before applying them to update the model's parameters. In your example, both of those steps are handled by the single call to the AdamOptimizer.minimize() method.

In order to clip your gradients you'll need to explicitly compute, clip, and apply them, as described in the optimizer section of TensorFlow's API documentation. Specifically, you'll need to replace the call to the minimize() method with something like the following:

    optimizer = tf.train.AdamOptimizer(learning_rate=learning_rate)
    # Compute the gradients as a list of (gradient, variable) pairs.
    gvs = optimizer.compute_gradients(cost)
    # Clip each gradient element to [-1, 1]. Skip variables that received
    # no gradient (compute_gradients returns (None, var) for those).
    capped_gvs = [(tf.clip_by_value(grad, -1., 1.), var)
                  for grad, var in gvs if grad is not None]
    train_op = optimizer.apply_gradients(capped_gvs)
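Note that tf.clip_by_value clips each gradient element independently, which can change the gradient's overall direction. TensorFlow also provides tf.clip_by_global_norm, which rescales all gradients jointly so their combined L2 norm stays below a threshold, preserving direction. A rough sketch of that rescaling in plain NumPy (the function here is illustrative, not TensorFlow's actual implementation):

```python
import numpy as np

def clip_by_global_norm(grads, clip_norm):
    """Jointly rescale gradients so their combined L2 norm is
    at most clip_norm (mirrors the idea of tf.clip_by_global_norm)."""
    global_norm = np.sqrt(sum(np.sum(np.square(g)) for g in grads))
    # If the global norm already fits within clip_norm, scale is 1
    # and the gradients pass through unchanged.
    scale = clip_norm / max(global_norm, clip_norm)
    return [g * scale for g in grads], global_norm

grads = [np.array([3.0, 4.0]), np.array([0.0])]
clipped, norm = clip_by_global_norm(grads, clip_norm=1.0)
# Here the global norm is 5.0, so every gradient is scaled by 1/5,
# shrinking the magnitude while preserving the direction.
```

In the TF1 training loop above, the equivalent change would be to clip the gradient list with tf.clip_by_global_norm before passing the pairs to apply_gradients.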
