Official examples explained 4.25 (mnist_denoising_autoencoder.py) - Keras study notes, part 4

Training a denoising autoencoder on the MNIST dataset

 

Keras examples index

Results

(figure: original, corrupted, and denoised MNIST digits)

Annotated code

'''Trains a denoising autoencoder on the MNIST dataset.

Denoising is one of the classic applications of autoencoders.
The denoising process removes unwanted noise that corrupted the
true signal.

Noise + Data ---> Denoising Autoencoder ---> Data

Given a training dataset of corrupted data as input and
true signal as output, a denoising autoencoder can recover the
hidden structure to generate clean data.

This example has a modular design. The encoder, decoder and autoencoder
are 3 models that share weights. For example, after training the
autoencoder, the encoder can be used to generate latent vectors
of input data for low-dim visualization like PCA or t-SNE.

Notes on the two visualization methods mentioned above:
- PCA (Principal Component Analysis) does more than reduce the
  dimensionality of high-dimensional data: the projection also discards
  noise and can reveal patterns in the data.
- t-SNE (t-distributed stochastic neighbor embedding) is a nonlinear
  dimensionality-reduction algorithm proposed by Laurens van der Maaten
  and Geoffrey Hinton in 2008. It is well suited to embedding
  high-dimensional data into 2 or 3 dimensions for visualization, and it
  extends the earlier SNE (Stochastic Neighbor Embedding; Hinton and
  Roweis, 2002).
'''

from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
import keras
from keras.layers import Activation, Dense, Input
from keras.layers import Conv2D, Flatten
from keras.layers import Reshape, Conv2DTranspose
from keras.models import Model
from keras import backend as K
from keras.datasets import mnist
import numpy as np
import matplotlib.pyplot as plt
from PIL import Image

np.random.seed(1337)

# MNIST dataset
(x_train, _), (x_test, _) = mnist.load_data()

image_size = x_train.shape[1]
x_train = np.reshape(x_train, [-1, image_size, image_size, 1])
x_test = np.reshape(x_test, [-1, image_size, image_size, 1])
x_train = x_train.astype('float32') / 255
x_test = x_test.astype('float32') / 255

# Generate corrupted MNIST images by adding noise from a normal
# distribution centered at 0.5 with std=0.5
noise = np.random.normal(loc=0.5, scale=0.5, size=x_train.shape)
x_train_noisy = x_train + noise
noise = np.random.normal(loc=0.5, scale=0.5, size=x_test.shape)
x_test_noisy = x_test + noise

x_train_noisy = np.clip(x_train_noisy, 0., 1.)
x_test_noisy = np.clip(x_test_noisy, 0., 1.)

# Network parameters
input_shape = (image_size, image_size, 1)
batch_size = 128
kernel_size = 3
latent_dim = 16
# Number of CNN layers and filters per layer in the encoder/decoder
layer_filters = [32, 64]

# Build the Autoencoder Model
# First build the Encoder Model
inputs = Input(shape=input_shape, name='encoder_input')
x = inputs
# Stack of Conv2D blocks
# Notes:
# 1) Use Batch Normalization before ReLU on deep networks
# 2) Use MaxPooling2D as alternative to strides>1
#    - faster but not as good as strides>1


for filters in layer_filters:
    x = Conv2D(filters=filters,
               kernel_size=kernel_size,
               strides=2,
               activation='relu',
               padding='same')(x)

# Shape info needed to build the Decoder Model
shape = K.int_shape(x)

# Generate the latent vector
x = Flatten()(x)
latent = Dense(latent_dim, name='latent_vector')(x)

# Instantiate the Encoder Model
encoder = Model(inputs, latent, name='encoder')
encoder.summary()

# Build the Decoder Model
latent_inputs = Input(shape=(latent_dim,), name='decoder_input')
x = Dense(shape[1] * shape[2] * shape[3])(latent_inputs)
x = Reshape((shape[1], shape[2], shape[3]))(x)

# Stack of Conv2DTranspose blocks
# Notes:
# 1) Use Batch Normalization before ReLU on deep networks
# 2) Use UpSampling2D as alternative to strides>1
#    - faster but not as good as strides>1
for filters in layer_filters[::-1]:
    x = Conv2DTranspose(filters=filters,
                        kernel_size=kernel_size,
                        strides=2,
                        activation='relu',
                        padding='same')(x)

x = Conv2DTranspose(filters=1,
                    kernel_size=kernel_size,
                    padding='same')(x)

outputs = Activation('sigmoid', name='decoder_output')(x)

# Instantiate the Decoder Model
decoder = Model(latent_inputs, outputs, name='decoder')
decoder.summary()

# Autoencoder = Encoder + Decoder
# Instantiate the Autoencoder Model
autoencoder = Model(inputs, decoder(encoder(inputs)), name='autoencoder')
autoencoder.summary()

autoencoder.compile(loss='mse', optimizer='adam')

# Train the autoencoder
autoencoder.fit(x_train_noisy,
                x_train,
                validation_data=(x_test_noisy, x_test),
                epochs=30,
                batch_size=batch_size)

# Predict the autoencoder output from the corrupted test images
x_decoded = autoencoder.predict(x_test_noisy)

# Display the first rows * cols (300) test digits:
# originals (top), corrupted inputs (middle), denoised outputs (bottom)
rows, cols = 10, 30
num = rows * cols
imgs = np.concatenate([x_test[:num], x_test_noisy[:num], x_decoded[:num]])
imgs = imgs.reshape((rows * 3, cols, image_size, image_size))
imgs = np.vstack(np.split(imgs, rows, axis=1))
imgs = imgs.reshape((rows * 3, -1, image_size, image_size))
imgs = np.vstack([np.hstack(i) for i in imgs])
imgs = (imgs * 255).astype(np.uint8)
plt.figure()
plt.axis('off')
plt.title('Original images: top rows, '
          'Corrupted Input: middle rows, '
          'Denoised Input: bottom rows')
plt.imshow(imgs, interpolation='none', cmap='gray')
Image.fromarray(imgs).save('corrupted_and_denoised.png')
plt.show()
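The docstring above notes that, after training, the encoder alone can produce latent vectors for low-dimensional visualization with PCA or t-SNE. As a hedged illustration of that idea, here is a minimal numpy-only PCA sketch; the `latent` array is a random stand-in for what `encoder.predict(x_test)` would return (shape: samples x latent_dim=16), since running the trained model is not assumed.

```python
import numpy as np

def pca_project(latent, n_components=2):
    """Project latent vectors onto the top principal components via SVD."""
    centered = latent - latent.mean(axis=0)      # zero-mean each feature
    # Rows of vt are the principal directions, sorted by singular value
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:n_components].T        # coordinates in PC space

# Stand-in for encoder.predict(x_test): 1000 samples, latent_dim=16
rng = np.random.default_rng(0)
latent = rng.normal(size=(1000, 16))

coords = pca_project(latent)
print(coords.shape)  # 2-D coordinates, one row per sample
```

With the real encoder, `coords` could be scatter-plotted with `plt.scatter`, colored by the MNIST digit labels, to inspect how well the 16-dim latent space separates the classes.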

Execution output

 

C:\ProgramData\Anaconda3\python.exe E:/keras-master/examples/mnist_denoising_autoencoder.py

Using TensorFlow backend.
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
encoder_input (InputLayer)   (None, 28, 28, 1)         0         
_________________________________________________________________
conv2d_1 (Conv2D)            (None, 14, 14, 32)        320       
_________________________________________________________________
conv2d_2 (Conv2D)            (None, 7, 7, 64)          18496     
_________________________________________________________________
flatten_1 (Flatten)          (None, 3136)              0         
_________________________________________________________________
latent_vector (Dense)        (None, 16)                50192     
=================================================================
Total params: 69,008
Trainable params: 69,008
Non-trainable params: 0
_________________________________________________________________
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
decoder_input (InputLayer)   (None, 16)                0         
_________________________________________________________________
dense_1 (Dense)              (None, 3136)              53312     
_________________________________________________________________
reshape_1 (Reshape)          (None, 7, 7, 64)          0         
_________________________________________________________________
conv2d_transpose_1 (Conv2DTr (None, 14, 14, 64)        36928     
_________________________________________________________________
conv2d_transpose_2 (Conv2DTr (None, 28, 28, 32)        18464     
_________________________________________________________________
conv2d_transpose_3 (Conv2DTr (None, 28, 28, 1)         289       
_________________________________________________________________
decoder_output (Activation)  (None, 28, 28, 1)         0         
=================================================================
Total params: 108,993
Trainable params: 108,993
Non-trainable params: 0
_________________________________________________________________
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
encoder_input (InputLayer)   (None, 28, 28, 1)         0         
_________________________________________________________________
encoder (Model)              (None, 16)                69008     
_________________________________________________________________
decoder (Model)              (None, 28, 28, 1)         108993    
=================================================================
Total params: 178,001
Trainable params: 178,001
Non-trainable params: 0
_________________________________________________________________
Train on 60000 samples, validate on 10000 samples
Epoch 1/30

  128/60000 [..............................] - ETA: 1:19:45 - loss: 0.2307
...
60000/60000 [==============================] - 59s 976us/step - loss: 0.0604 - val_loss: 0.0341
Epoch 2/30

  128/60000 [..............................] - ETA: 3:02 - loss: 0.0341
...
60000/60000 [==============================] - 16s 261us/step - loss: 0.0143 - val_loss: 0.0155

Process finished with exit code 0
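The log above shows the per-pixel MSE loss falling from 0.0341 to 0.0155 on the validation set over 30 epochs. For readers who want to quantify denoising quality themselves, here is a hedged numpy sketch of the same MSE metric plus PSNR, a common image-quality measure; the `clean` and `denoised` arrays are synthetic stand-ins for `x_test` and `x_decoded`.

```python
import numpy as np

def mse(clean, denoised):
    """Per-pixel mean squared error: the same quantity as the 'mse' loss."""
    return float(np.mean((clean - denoised) ** 2))

def psnr(clean, denoised, peak=1.0):
    """Peak signal-to-noise ratio in dB for images scaled to [0, peak]."""
    return 10.0 * np.log10(peak ** 2 / mse(clean, denoised))

# Stand-ins for x_test and x_decoded: a batch of 28x28x1 images in [0, 1]
rng = np.random.default_rng(1)
clean = rng.random((8, 28, 28, 1)).astype('float32')
denoised = np.clip(clean + rng.normal(scale=0.1, size=clean.shape), 0., 1.)

print(mse(clean, denoised))   # small positive value
print(psnr(clean, denoised))  # higher dB means better reconstruction
```

For reference, an MSE of 0.0155 as in the final epoch corresponds to a PSNR of about 18 dB on [0, 1]-scaled images.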

More on Keras

English: https://keras.io/

Chinese: http://keras-cn.readthedocs.io/en/latest/

Example downloads

https://github.com/keras-team/keras

https://github.com/keras-team/keras/tree/master/examples

Complete project download

For readers without download credits, join QQ group 452205574 to access the shared folder.

It includes: code, the dataset (images), the trained model, and library installation files.
