Implementing Softmax Regression in TensorFlow

import tensorflow as tf

sess = tf.InteractiveSession()
# Input: each image is a flattened 28x28 = 784-dimensional vector
x = tf.placeholder(tf.float32, [None, 784])
# Weights and biases for the 10 output classes, initialized to zeros
W = tf.Variable(tf.zeros([784, 10]))
b = tf.Variable(tf.zeros([10]))

# Softmax regression model: predicted class probabilities for each input
y = tf.nn.softmax(tf.matmul(x, W) + b)
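
For each input x the model computes y = softmax(xW + b), where softmax(z)_i = exp(z_i) / Σ_j exp(z_j). This normalizes the 10 raw class scores into a probability distribution that sums to 1.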

Define cross-entropy (the loss function)

# Placeholder for the true labels, given as one-hot vectors
y_ = tf.placeholder(tf.float32, [None, 10])
# Loss function: mean cross-entropy over the batch
cross_entropy = tf.reduce_mean(-tf.reduce_sum(y_ * tf.log(y), reduction_indices=[1]))
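
For a single example the cross-entropy is H(y', y) = -Σ_i y'_i * log(y_i), where y' is the true one-hot label and y the predicted distribution. tf.reduce_sum with reduction_indices=[1] computes this per-example loss, and tf.reduce_mean then averages it over the batch.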

Optimization algorithm

# Optimizer: Stochastic Gradient Descent (SGD) with a learning rate of 0.5;
# each training step runs back propagation and one gradient-descent update
train_step = tf.train.GradientDescentOptimizer(0.5).minimize(cross_entropy)
# Initialize all global variables
tf.global_variables_initializer().run()
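
With the graph defined and the variables initialized, training consists of repeatedly feeding mini-batches into train_step and finally measuring accuracy on the test set. Below is a minimal sketch, assuming the MNIST data is loaded via the tensorflow.examples.tutorials.mnist helper; the "MNIST_data/" path, the batch size of 100 and the 1000 iterations are illustrative choices, not part of the original.

from tensorflow.examples.tutorials.mnist import input_data

# Load MNIST with one-hot labels (local data directory is an assumed path)
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)

# Run 1000 training steps, each on a random mini-batch of 100 images
for i in range(1000):
    batch_xs, batch_ys = mnist.train.next_batch(100)
    train_step.run({x: batch_xs, y_: batch_ys})

# Accuracy: fraction of test images whose predicted class matches the true label
correct_prediction = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
print(accuracy.eval({x: mnist.test.images, y_: mnist.test.labels}))

A plain softmax regression trained this way typically reaches roughly 92% test accuracy on MNIST.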
