TensorFlow variable learning rate

Exponential decay

# decayed_learning_rate = learning_rate *
#                         decay_rate ^ (global_step / decay_steps)
# You can run help(tf.train.exponential_decay) in Python 3 to see this function's documentation.

global_step = tf.Variable(0, trainable=False)
learning_rate = tf.train.exponential_decay(0.1, global_step, 100, 0.96, staircase=True)
# generate the decayed learning rate

learning_step = tf.train.GradientDescentOptimizer(learning_rate).minimize(.....,
                                                             global_step=global_step)
# Use the exponentially decayed learning rate. Because global_step is passed to
# minimize(), every sess.run(learning_step) increments global_step by 1;
# you do not need to update global_step yourself in the training loop.
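The decay formula above can be sketched in plain Python to see what the two modes do (this is an illustration of the formula, not TF's actual implementation):

```python
def exponential_decay(learning_rate, global_step, decay_steps, decay_rate,
                      staircase=False):
    """Sketch of the exponential-decay formula used by tf.train.exponential_decay."""
    exponent = global_step / decay_steps          # smooth, continuous decay
    if staircase:
        exponent = global_step // decay_steps     # integer division: drop every decay_steps
    return learning_rate * decay_rate ** exponent

# With staircase=True, steps 200..299 all use the same rate 0.1 * 0.96**2
print(exponential_decay(0.1, 250, 100, 0.96, staircase=True))   # 0.09216
```

With staircase=False the rate shrinks a little at every step; staircase=True keeps it constant within each window of decay_steps steps.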

Piecewise constant decay

global_step = tf.Variable(0, trainable=False)
boundaries = [100000, 110000]
values = [1.0, 0.5, 0.1]
learning_rate = tf.train.piecewise_constant(global_step, boundaries, values)

learning_step = tf.train.GradientDescentOptimizer(learning_rate).minimize(.....,
                                                             global_step=global_step)


  # Later, whenever we perform an optimization step, we increment global_step.
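The interval lookup that piecewise_constant performs can be sketched in plain Python (an illustration of the behavior, not TF's implementation). Note values has one more entry than boundaries, and a step equal to a boundary still uses the earlier value:

```python
import bisect

def piecewise_constant(global_step, boundaries, values):
    """Sketch of tf.train.piecewise_constant: pick the value for the
    interval that contains global_step (boundaries are inclusive)."""
    # bisect_left finds the first boundary >= global_step, which is
    # exactly the index of the interval global_step falls into.
    return values[bisect.bisect_left(boundaries, global_step)]

# 1.0 up to and including step 100000, then 0.5 up to 110000, then 0.1
print(piecewise_constant(100500, [100000, 110000], [1.0, 0.5, 0.1]))  # 0.5
```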
