Keras Learning, Part 1: A Simple Neural Network

       This example uses a locally downloaded copy of the MNIST handwritten digit dataset. The code is simple enough that it needs little explanation; Keras really does make building neural networks convenient, and a two-layer network takes only a few lines of code.
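(Side note: the mnist.npz archive read below has the same structure as the file that keras.datasets.mnist caches under ~/.keras/datasets. If no local copy is available, the built-in loader can fetch it instead; a minimal sketch, assuming an internet connection on first use, and not part of the original script:

from keras.datasets import mnist

# Downloads mnist.npz on the first call and caches it under ~/.keras/datasets
(x_train, y_train), (x_test, y_test) = mnist.load_data()
)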

from keras import models
from keras import layers
import numpy as np
from keras.utils import to_categorical

def load_data():
    # Load the locally stored MNIST archive (same format as keras.datasets.mnist)
    path = './mnist.npz'
    f = np.load(path)
    x_train, y_train = f['x_train'], f['y_train']
    x_test, y_test = f['x_test'], f['y_test']
    f.close()
    return (x_train, y_train), (x_test, y_test)


if __name__ == '__main__':
    (train_images, train_labels), (test_images, test_labels) = load_data()
    # Flatten the 28x28 images into 784-dimensional vectors and scale pixels to [0, 1]
    train_images = train_images.reshape((60000, 28 * 28))
    train_images = train_images.astype('float32') / 255
    test_images = test_images.reshape((10000, 28 * 28))
    test_images = test_images.astype('float32') / 255
    # One-hot encode the digit labels (0-9)
    train_labels = to_categorical(train_labels)
    test_labels = to_categorical(test_labels)
    # Two-layer fully connected network: 512 ReLU units followed by a 10-way softmax
    network = models.Sequential()
    network.add(layers.Dense(512, activation='relu', input_shape=(28 * 28,)))
    network.add(layers.Dense(10, activation='softmax'))
    network.compile(optimizer='rmsprop', loss='categorical_crossentropy', metrics=['accuracy'])
    network.fit(train_images, train_labels, epochs=5, batch_size=128)
Output of the last epoch:
Epoch 5/5
60000/60000 [==============================] - 4s 60us/step - loss: 0.0369 - acc: 0.9890
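The script prepares test_images and test_labels but never actually uses them. A minimal sketch of evaluating the trained model on the held-out test set (not part of the original script) might look like this:

    # Evaluate on the 10,000 held-out test images; returns the loss plus the
    # metrics listed in compile(), i.e. accuracy here
    test_loss, test_acc = network.evaluate(test_images, test_labels)
    print('test_loss:', test_loss, 'test_acc:', test_acc)

The test accuracy would normally come in somewhat below the 0.9890 training accuracy shown above, since the model is being scored on data it has never seen.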


