Using the Adam optimizer in a single-layer NN

import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

# Load MNIST with one-hot labels
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

# Single-layer model: logits y = x * W + b
x = tf.placeholder(tf.float32, [None, 784])
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))
y = tf.matmul(x, W) + b
y_ = tf.placeholder(tf.float32, [None, 10])

# Softmax cross-entropy loss, minimized with plain gradient descent
cross_entropy = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y_, logits=y))
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)

sess = tf.InteractiveSession()
tf.global_variables_initializer().run()

# Train on mini-batches of 100, printing the batch loss each step
for _ in range(1000):
    batch_xs, batch_ys = mnist.train.next_batch(100)
    _, cost = sess.run([train_step, cross_entropy],
                       feed_dict={x: batch_xs, y_: batch_ys})
    print(cost)

Output (the first value, 2.30259, is ln 10, the expected cross-entropy when the untrained softmax spreads probability evenly over the 10 classes):

2.30259
1.96883
1.76687
1.39246
1.22997
1.19243
1.08657
1.12234
1.30965
0.833457
0.963737
0.767362
0.763479
0.75128
0.709319
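As a side note not in the original post, once the loop above finishes you could check test-set accuracy with the same TF 1.x API. A minimal sketch, assuming the session, the placeholders x and y_, the logits y, and the mnist object defined above:

# Sketch: accuracy of the trained single-layer model on the MNIST test set
correct_prediction = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
print(sess.run(accuracy, feed_dict={x: mnist.test.images,
                                    y_: mnist.test.labels}))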

Switching to the Adam algorithm by replacing the optimizer line, keeping the same learning rate of 0.5:

train_step = tf.train.AdamOptimizer(0.5).minimize(cross_entropy)

Output:

2.30259
16.8401
17.2679
16.3104
14.2131
16.3672
18.8276
11.4264
15.4074
8.97122
3.11424
5.39649
5.58269
7.20344
7.48695
11.5576
8.18053
4.38636
8.00477
3.82183
10.1561
4.68156
5.63022
3.41254
6.28483

Why the loss oscillates like this with Adam needs more thought and an answer later.
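One likely factor, offered here only as a guess rather than a confirmed answer: 0.5 is a very large learning rate for Adam, whose default in tf.train.AdamOptimizer is 0.001, so the adaptive per-parameter steps can overshoot and make the batch loss jump around. A minimal sketch of retrying with the default rate (the 0.001 value is an assumption to test, not a result from the original post; everything else in the script above stays the same):

# Hypothetical retry: Adam at its TF default learning rate (0.001) instead of 0.5
train_step = tf.train.AdamOptimizer(learning_rate=0.001).minimize(cross_entropy)
tf.global_variables_initializer().run()  # re-initialize, including Adam's slot variables

for _ in range(1000):
    batch_xs, batch_ys = mnist.train.next_batch(100)
    _, cost = sess.run([train_step, cross_entropy],
                       feed_dict={x: batch_xs, y_: batch_ys})
    print(cost)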
