Displaying the loss with TensorBoard

Using the MNIST dataset as an example.
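The snippets below assume the graph (placeholders, model, loss, and optimizer) has already been built. The original post does not show that part, so the following is only a minimal sketch under assumed names (x, y, keepProb, loss, optimizer, init, mnist, batchSize); the model here is a hypothetical one-hidden-layer network with dropout, not the author's actual network.

import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

# Assumed setup: load MNIST and define a small model whose names match the
# code below (x, y, keepProb, loss, optimizer, init, mnist, batchSize).
mnist = input_data.read_data_sets('MNIST_data', one_hot=True)
batchSize = 100

x = tf.placeholder(tf.float32, [None, 784])
y = tf.placeholder(tf.float32, [None, 10])
keepProb = tf.placeholder_with_default(1.0, shape=())  # dropout keep probability

hidden = tf.layers.dense(x, 256, activation=tf.nn.relu)
hidden = tf.nn.dropout(hidden, keep_prob=keepProb)
logits = tf.layers.dense(hidden, 10)

loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits_v2(labels=y, logits=logits))
optimizer = tf.train.AdamOptimizer(1e-3).minimize(loss)
init = tf.global_variables_initializer()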

# Add a summary node that records the scalar value of the loss
tf.summary.scalar('loss', loss)



# Merge all registered summary nodes into a single op
merge = tf.summary.merge_all()
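tf.summary.merge_all() bundles every summary op registered in the default graph, so any extra scalars added before this call are written by the same merged op. For example (purely an illustration under the assumptions above; accuracy is not defined in the original post and logits comes from the sketch earlier), an accuracy curve could be recorded alongside the loss:

# Hypothetical: register an accuracy summary before merging, and TensorBoard
# will show an accuracy curve next to the loss curve.
correct = tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1))
accuracy = tf.reduce_mean(tf.cast(correct, tf.float32))
tf.summary.scalar('accuracy', accuracy)
merge = tf.summary.merge_all()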


# Open a session
with tf.Session() as sess:
    sess.run(init)
    # Writer that saves the event files (and the graph) under the log directory
    summary_writer = tf.summary.FileWriter('/home/penelope/workspace/python/cnn/learning/sigai/mnist/log', graph=tf.get_default_graph())

    num_batches = mnist.train.num_examples // batchSize
    for epoch in range(20):
        for batch in range(num_batches):
            batch_x, batch_y = mnist.train.next_batch(batchSize)
            sess.run(optimizer, feed_dict={x: batch_x, y: batch_y, keepProb: 0.1})
            # Write the merged summaries to the event file once per epoch
            if batch == 0:
                summary = sess.run(merge, feed_dict={x: batch_x, y: batch_y, keepProb: 1.0})
                summary_writer.add_summary(summary, epoch)

    # Loss on the test set (dropout disabled)
    test_cost = sess.run(loss, feed_dict={x: mnist.test.images, y: mnist.test.labels, keepProb: 1.0})
    summary_writer.close()
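The test loss computed at the end is not written to TensorBoard by the loop above. As an optional sketch (an assumption, not part of the original code), it can be added as one last point under the same 'loss' tag by building a Summary protobuf directly; these lines would sit inside the with block, just before summary_writer.close():

    # Optional (assumed) addition: log the final test loss as step 20 so it
    # shows up as the last point of the loss curve in TensorBoard.
    test_summary = tf.Summary(value=[tf.Summary.Value(tag='loss',
                                                      simple_value=float(test_cost))])
    summary_writer.add_summary(test_summary, 20)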



In a terminal, change into the directory passed to the FileWriter, i.e. the parent directory of log, activate the TensorFlow environment, and run

tensorboard --logdir log

Here log is the directory that holds the event files written by the FileWriter.

TensorBoard prints a URL; open it in a browser to see the loss curve.
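By default TensorBoard serves on port 6006, so the URL is usually http://localhost:6006. If that port is already in use, another one can be chosen with the --port flag, for example:

tensorboard --logdir log --port 6007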
