Setting the Learning Rate in a Neural Network with Python and TensorFlow (study notes)

Choosing the learning rate carefully speeds up training in its early stages, while also preventing the loss function from oscillating back and forth around a minimum in the later stages.

# Learning rate set to 1
import tensorflow as tf  # TensorFlow 1.x API

training_steps = 10
learning_rate = 1

# Minimize y = x^2, whose unique minimum is at x = 0.
x = tf.Variable(tf.constant(5, dtype=tf.float32), name='x')
y = tf.square(x)

train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(y)

with tf.Session() as sess:
    init_op = tf.global_variables_initializer()
    sess.run(init_op)
    for i in range(training_steps):
        sess.run(train_op)
        x_value = sess.run(x)
        print("After %s iters:x%s is %f." % (i + 1, i + 1, x_value))

After 1 iters:x1 is -5.000000.
After 2 iters:x2 is 5.000000.
After 3 iters:x3 is -5.000000.
After 4 iters:x4 is 5.000000.
After 5 iters:x5 is -5.000000.
After 6 iters:x6 is 5.000000.
After 7 iters:x7 is -5.000000.
After 8 iters:x8 is 5.000000.
After 9 iters:x9 is -5.000000.
After 10 iters:x10 is 5.000000.
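Why the oscillation? For y = x² the gradient is 2x, so one gradient-descent step is x ← x − lr·2x = x·(1 − 2·lr). With lr = 1 this reduces to x ← −x: x flips sign every step and never shrinks. A minimal plain-Python sketch of the same update, no TensorFlow needed:

```python
# Gradient descent on y = x**2; dy/dx = 2x, so each step is x <- x * (1 - 2*lr).
def gradient_descent(x, lr, steps):
    for _ in range(steps):
        x = x - lr * 2 * x
    return x

print(gradient_descent(5.0, 1.0, 10))  # 5.0: lr = 1 just flips the sign each step
print(gradient_descent(5.0, 1.0, 9))   # -5.0
```

For this particular function, the factor (1 − 2·lr) has magnitude below 1 exactly when 0 < lr < 1 (convergence), equals 1 at lr = 1 (oscillation), and exceeds 1 beyond that (divergence).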

# Learning rate set to 0.001
import tensorflow as tf  # TensorFlow 1.x API

training_steps = 1000
learning_rate = 0.001

x = tf.Variable(tf.constant(5, dtype=tf.float32), name='x')
y = tf.square(x)

train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(y)

with tf.Session() as sess:
    init_op = tf.global_variables_initializer()
    sess.run(init_op)
    for i in range(training_steps):
        sess.run(train_op)
        if i % 100 == 0:  # log every 100 steps
            x_value = sess.run(x)
            print("After %s iters:x%s is %f." % (i + 1, i + 1, x_value))

After 1 iters:x1 is 4.990000.
After 101 iters:x101 is 4.084646.
After 201 iters:x201 is 3.343555.
After 301 iters:x301 is 2.736923.
After 401 iters:x401 is 2.240355.
After 501 iters:x501 is 1.833880.
After 601 iters:x601 is 1.501153.
After 701 iters:x701 is 1.228794.
After 801 iters:x801 is 1.005850.
After 901 iters:x901 is 0.823355.
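With a constant lr = 0.001, the same update x ← x·(1 − 2·lr) has the closed form xₙ = 5·0.998ⁿ, which reproduces the printed values (up to float32 rounding) and makes the slow convergence concrete — a quick check independent of TensorFlow:

```python
# Closed form of the lr = 0.001 run above: x_n = x0 * (1 - 2*lr)**n.
def x_after(n, x0=5.0, lr=0.001):
    return x0 * (1 - 2 * lr) ** n

print(x_after(101))  # ~4.084646, matching "After 101 iters"
print(x_after(901))  # ~0.823355, matching "After 901 iters"
```

Even after 1000 steps x is still nowhere near 0: a small learning rate is stable but wastes iterations.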

# Exponentially decaying learning rate
import tensorflow as tf  # TensorFlow 1.x API

training_steps = 100
global_step = tf.Variable(0, trainable=False)  # step counter, not a model parameter
# Start at 0.1 and multiply by 0.96 after every step (decay_steps=1, staircase=True).
learning_rate = tf.train.exponential_decay(0.1, global_step, 1, 0.96, staircase=True)

x = tf.Variable(tf.constant(5, dtype=tf.float32), name="x")
y = tf.square(x)
# Passing global_step makes minimize() increment it, which drives the decay.
train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(y, global_step=global_step)

with tf.Session() as sess:
    init_op = tf.global_variables_initializer()
    sess.run(init_op)
    for i in range(training_steps):
        sess.run(train_op)
        if i % 10 == 0:  # log every 10 steps
            learning_rate_value = sess.run(learning_rate)
            x_value = sess.run(x)
            print("After %s iters:x%s is %f,learning rate is %f." % (i + 1, i + 1, x_value, learning_rate_value))

After 1 iters:x1 is 4.000000,learning rate is 0.096000.
After 11 iters:x11 is 0.690561,learning rate is 0.063824.
After 21 iters:x21 is 0.222583,learning rate is 0.042432.
After 31 iters:x31 is 0.106405,learning rate is 0.028210.
After 41 iters:x41 is 0.065548,learning rate is 0.018755.
After 51 iters:x51 is 0.047625,learning rate is 0.012469.
After 61 iters:x61 is 0.038558,learning rate is 0.008290.
After 71 iters:x71 is 0.033523,learning rate is 0.005511.
After 81 iters:x81 is 0.030553,learning rate is 0.003664.
After 91 iters:x91 is 0.028727,learning rate is 0.002436.
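The schedule tf.train.exponential_decay computes is decayed_lr = initial_lr · decay_rate^(global_step / decay_steps), where staircase=True makes the exponent an integer division. With decay_steps=1 this is simply 0.1·0.96^step, which matches the learning rates printed above — a minimal sketch of that formula:

```python
# The staircase exponential-decay schedule: integer division because staircase=True.
def exponential_decay(initial_lr, global_step, decay_steps, decay_rate):
    return initial_lr * decay_rate ** (global_step // decay_steps)

print(round(exponential_decay(0.1, 1, 1, 0.96), 6))   # 0.096, as after 1 iter
print(round(exponential_decay(0.1, 11, 1, 0.96), 6))  # 0.063824, as after 11 iters
```

This gives the best of both earlier runs: large early steps for fast progress, then ever-smaller steps so x settles near the minimum instead of oscillating.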
