Today we will train a simple machine-learning classifier. We label a set of coordinate points ahead of time and then let the computer learn to classify them.
The coordinate points are:
Class 1:
[1, 5], [2, 7], [9, 14], [6, 10], [8, 21], [16, 19]
Class 2:
[5, 1], [7, 2], [14, 9], [10, 6], [21, 8], [19, 16]
Next we attach a label to each class:
Class 1:
[1], [1], [1], [1], [1], [1]
Class 2:
[0], [0], [0], [0], [0], [0]
You can see that the two classes are separated by the line y = x. Can machine learning pick up that feature?
Enough talk; let's look at the code and explain as we go:
import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Activation
from keras import optimizers
# collect the training data
x_train = np.array([[1, 5], [2, 7], [9, 14], [6, 10], [8, 21], [16, 19],
                    [5, 1], [7, 2], [14, 9], [10, 6], [21, 8], [19, 16]])
y_train = np.array([[1], [1], [1], [1], [1], [1],
                    [0], [0], [0], [0], [0], [0]])
print(x_train)
print(y_train)
# design the model
model = Sequential()
model.add(Dense(1, input_dim=2, activation=None, use_bias=False))
model.add(Activation('sigmoid'))
# compile the model and pick the optimizer and loss function
ada = optimizers.Adagrad(lr=0.1, epsilon=1e-8)
model.compile(optimizer=ada, loss='binary_crossentropy', metrics=['accuracy'])
# train the model
print('training')
model.fit(x_train, y_train, batch_size=4, epochs=100, shuffle=True)
model.fit(x_train, y_train, batch_size=12, epochs=100, shuffle=True)
# test the model
test_ans = model.predict(np.array([[2, 20], [20, 2]]), batch_size=2)
print('model_weight')
print(model.layers[0].get_weights())
print('ans')
print(test_ans)
That is the full listing. Now let's go through it piece by piece.
Import the required modules:
import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Activation
from keras import optimizers
Most of these modules were covered in detail in "Keras, Day 1", so I will only explain the ones that are new here.
One new module is Activation. It differs a little from a Dense layer: an Activation layer simply takes the output of the previous layer and applies a function to each dimension. We will see how it is used shortly.
The other new module is optimizers, which is mainly for customizing your own optimizer. "Customizing" really just means changing a few parameters so the optimizer better suits your model. Again, we will see how it is used below.
Collect the training data:
# collect the training data
x_train = np.array([[1, 5], [2, 7], [9, 14], [6, 10], [8, 21], [16, 19],
                    [5, 1], [7, 2], [14, 9], [10, 6], [21, 8], [19, 16]])
y_train = np.array([[1], [1], [1], [1], [1], [1],
                    [0], [0], [0], [0], [0], [0]])
print(x_train)
print(y_train)
This was basically all covered in "Keras, Day 1", and there is no new kind of data here. You could also generate some random data with numpy, but randomly generated data needs a bit of processing before it can be used for training, so to save time I simply wrote out a handful of points by hand (a sketch of the random alternative follows below).
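As an aside, if you did want to generate random data instead, a minimal sketch might look like this; the sampling range and the labelling rule (1 above the line y = x, 0 below) are my own illustrative choices, not something from the original listing:

import numpy as np

# Sample 12 random 2-D points and label them by which side of y = x they fall on.
rng = np.random.RandomState(0)
points = rng.uniform(0, 25, size=(12, 2))
labels = (points[:, 1] > points[:, 0]).astype(int).reshape(-1, 1)  # 1 above y = x, 0 below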
Design the model
# design the model
model = Sequential()
model.add(Dense(1, input_dim=2, activation=None, use_bias=False))
model.add(Activation('sigmoid'))
First, create an empty model: model = Sequential().
Then design the input layer of the network: a Dense layer with an output dimension of 1 and an input dimension of 2 (input_dim=2). The layer applies no activation of its own, so activation=None. Because of how this particular model is set up, we also have no use for a bias, so use_bias=False. Of course you can keep the bias if you prefer.
Finally, the output goes through an Activation layer with 'sigmoid'. Its job is to turn the raw score into a probability by squashing every value into the range 0 to 1; readers comfortable with the math will recognize that binary probability models of this kind reduce to a sigmoid applied to a linear score. An equivalent, more compact way to build the same model is sketched below.
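For reference, the model computes sigmoid(w1*x + w2*y), with sigmoid(z) = 1 / (1 + e^(-z)). Here is a minimal sketch of an equivalent way to build it, folding the sigmoid directly into the Dense layer; this is just the more compact Keras form, and the behaviour should be identical:

from keras.models import Sequential
from keras.layers import Dense

# Equivalent compact form: pass the sigmoid activation to the Dense layer directly.
model_alt = Sequential()
model_alt.add(Dense(1, input_dim=2, activation='sigmoid', use_bias=False))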
Compile the model
# compile the model and pick the optimizer and loss function
ada = optimizers.Adagrad(lr=0.1, epsilon=1e-8)
model.compile(optimizer=ada, loss='binary_crossentropy', metrics=['accuracy'])
This is where the optimizers module we imported earlier comes in: we use an optimizer called AdaGrad. The one remaining issue is that some of AdaGrad's default parameters are not well suited to this problem, so we adjust them ourselves:
ada = optimizers.Adagrad(lr=0.1, epsilon=1e-8)
lr is the learning rate; the default is 0.01, but with that value I trained for 200 epochs and the accuracy was still stuck at 0.5, so I set it a bit higher to speed up learning.
epsilon is a small constant and should be set to a very small number; it is there to prevent division by zero. To see why it is needed, look at the AdaGrad update formula.
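To make the role of epsilon concrete, here is a minimal numpy sketch of one AdaGrad update (my own notation, not code from this post): the accumulated squared gradient G sits under a square root in the denominator, and epsilon keeps that denominator away from zero at the start of training.

import numpy as np

# One AdaGrad step for a weight vector w given its gradient (sketch).
def adagrad_step(w, grad, G, lr=0.1, eps=1e-8):
    G = G + grad ** 2                       # accumulate squared gradients
    w = w - lr * grad / (np.sqrt(G) + eps)  # eps prevents division by zero early on
    return w, G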
https://blog.slinuxer.com/2016/09/sgd-comparison
That link surveys the various optimizers; refer to it if anything is unclear.
We also change the loss function:
Do not use mean_squared_error here; with this setup it no longer trains effectively, and the reason why is a math question we will not dig into now.
Just remember that binary_crossentropy is what to use here.
The other new piece is metrics. We add 'accuracy' to the metrics list, so during training the accuracy is printed for us as a reference. You can also define your own metrics (a small sketch follows below); if you are curious, see the documentation: https://keras-cn.readthedocs.io/en/latest/getting_started/sequential_model/
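As an example of a custom metric, here is a minimal sketch using the Keras backend; the metric itself, a plain mean absolute error, is only an illustration and not part of the original code:

from keras import backend as K

# Custom metric: mean absolute error between labels and predictions.
def mean_abs_error(y_true, y_pred):
    return K.mean(K.abs(y_true - y_pred))

# Passed to compile() alongside the built-in 'accuracy'.
model.compile(optimizer=ada, loss='binary_crossentropy',
              metrics=['accuracy', mean_abs_error])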
Train the model:
# train the model
print('training')
model.fit(x_train, y_train, batch_size=4, epochs=100, shuffle=True)
model.fit(x_train, y_train, batch_size=12, epochs=100, shuffle=True)
This is the training itself. Compared with "Keras, Day 1" there is one extra argument, shuffle, so a quick explanation:
shuffle simply shuffles the data. Our data is laid out very neatly (all the 1-labelled points first, then all the 0-labelled points), but for better training we want the batches to look random, so we shuffle (a manual equivalent is sketched below).
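For intuition, shuffle=True is roughly the same as reordering the samples yourself before each epoch; a minimal manual version with numpy:

import numpy as np

# Shuffle features and labels together with one shared permutation.
perm = np.random.permutation(len(x_train))
x_shuffled = x_train[perm]
y_shuffled = y_train[perm]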
Check how the model performs:
# test the model
test_ans = model.predict(np.array([[2, 20], [20, 2]]), batch_size=2)
print('model_weight')
print(model.layers[0].get_weights())
print('ans')
print(test_ans)
I could have used the evaluate method here, but that would also mean writing test data, and I am honestly lazy, so I skipped it.
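For completeness, a minimal sketch of what that evaluation could look like; the two test points and their labels below are made up for illustration and are not part of the original run:

import numpy as np

# Hypothetical held-out test set: one point above y = x, one below.
x_test = np.array([[3, 12], [12, 3]])
y_test = np.array([[1], [0]])

loss, acc = model.evaluate(x_test, y_test, batch_size=2)
print('test loss: %.4f, test accuracy: %.2f' % (loss, acc))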
From the printed output you can see that the training worked well.
The predicted probability for the first point is 0.999..., and for the second it is 0.00..., so the model does what we want; accuracy on the training set is also 100%.
The two weights sum to almost zero: the model's output is roughly sigmoid(0.63 * (y - x)), which is exactly our classification rule with y = x as the decision boundary.
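To see concretely how those weights encode the y > x rule, here is a quick sketch that applies them by hand (weight values rounded from the printout below):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w = np.array([-0.628, 0.626])    # learned weights, rounded
for point in ([2, 20], [20, 2]):
    score = np.dot(w, point)     # roughly 0.63 * (y - x): positive above the line
    print('%s -> %.6f' % (point, sigmoid(score)))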
Here is the full output of the run:
/home/kroossun/miniconda2/bin/python /home/kroossun/PycharmProjects/ML/classfication.py
Using TensorFlow backend.
[[ 1 5]
[ 2 7]
[ 9 14]
[ 6 10]
[ 8 21]
[16 19]
[ 5 1]
[ 7 2]
[14 9]
[10 6]
[21 8]
[19 16]]
[[1]
[1]
[1]
[1]
[1]
[1]
[0]
[0]
[0]
[0]
[0]
[0]]
training
Epoch 1/100
2017-08-28 22:39:49.208856: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.1 instructions, but these are available on your machine and could speed up CPU computations.
2017-08-28 22:39:49.208883: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use SSE4.2 instructions, but these are available on your machine and could speed up CPU computations.
2017-08-28 22:39:49.208888: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX instructions, but these are available on your machine and could speed up CPU computations.
2017-08-28 22:39:49.208891: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use AVX2 instructions, but these are available on your machine and could speed up CPU computations.
2017-08-28 22:39:49.208894: W tensorflow/core/platform/cpu_feature_guard.cc:45] The TensorFlow library wasn't compiled to use FMA instructions, but these are available on your machine and could speed up CPU computations.
4/12 [=========>....................] - ETA: 0s - loss: 0.4989 - acc: 0.7500
12/12 [==============================] - 0s - loss: 1.9820 - acc: 0.5000
Epoch 2/100
4/12 [=========>....................] - ETA: 0s - loss: 1.8210 - acc: 0.2500
12/12 [==============================] - 0s - loss: 0.8379 - acc: 0.6667
Epoch 3/100
12/12 [==============================] - 0s - loss: 0.4909 - acc: 0.8333
Epoch 4/100
12/12 [==============================] - 0s - loss: 0.3791 - acc: 0.9167
Epoch 5/100
4/12 [=========>....................] - ETA: 0s - loss: 0.3429 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.3378 - acc: 1.0000
Epoch 6/100
12/12 [==============================] - 0s - loss: 0.3276 - acc: 1.0000
Epoch 7/100
4/12 [=========>....................] - ETA: 0s - loss: 0.3761 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.3254 - acc: 1.0000
Epoch 8/100
12/12 [==============================] - 0s - loss: 0.3209 - acc: 1.0000
Epoch 9/100
4/12 [=========>....................] - ETA: 0s - loss: 0.3402 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.2887 - acc: 1.0000
Epoch 10/100
4/12 [=========>....................] - ETA: 0s - loss: 0.2358 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.3052 - acc: 1.0000
Epoch 11/100
4/12 [=========>....................] - ETA: 0s - loss: 0.3322 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.2738 - acc: 1.0000
Epoch 12/100
4/12 [=========>....................] - ETA: 0s - loss: 0.2268 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.2599 - acc: 1.0000
Epoch 13/100
4/12 [=========>....................] - ETA: 0s - loss: 0.3122 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.2534 - acc: 1.0000
Epoch 14/100
4/12 [=========>....................] - ETA: 0s - loss: 0.2903 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.2497 - acc: 1.0000
Epoch 15/100
4/12 [=========>....................] - ETA: 0s - loss: 0.2919 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.2411 - acc: 1.0000
Epoch 16/100
4/12 [=========>....................] - ETA: 0s - loss: 0.2528 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.2357 - acc: 1.0000
Epoch 17/100
4/12 [=========>....................] - ETA: 0s - loss: 0.2652 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.2263 - acc: 1.0000
Epoch 18/100
4/12 [=========>....................] - ETA: 0s - loss: 0.2866 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.2297 - acc: 1.0000
Epoch 19/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1840 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.2167 - acc: 1.0000
Epoch 20/100
12/12 [==============================] - 0s - loss: 0.2119 - acc: 1.0000
Epoch 21/100
12/12 [==============================] - 0s - loss: 0.2125 - acc: 1.0000
Epoch 22/100
4/12 [=========>....................] - ETA: 0s - loss: 0.2356 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.2055 - acc: 1.0000
Epoch 23/100
4/12 [=========>....................] - ETA: 0s - loss: 0.2458 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.2040 - acc: 1.0000
Epoch 24/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1764 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1953 - acc: 1.0000
Epoch 25/100
4/12 [=========>....................] - ETA: 0s - loss: 0.2271 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1926 - acc: 1.0000
Epoch 26/100
4/12 [=========>....................] - ETA: 0s - loss: 0.2150 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1846 - acc: 1.0000
Epoch 27/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1561 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1929 - acc: 1.0000
Epoch 28/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1726 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1865 - acc: 1.0000
Epoch 29/100
12/12 [==============================] - 0s - loss: 0.1788 - acc: 1.0000
Epoch 30/100
12/12 [==============================] - 0s - loss: 0.1710 - acc: 1.0000
Epoch 31/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1532 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1802 - acc: 1.0000
Epoch 32/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1811 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1678 - acc: 1.0000
Epoch 33/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1224 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1626 - acc: 1.0000
Epoch 34/100
4/12 [=========>....................] - ETA: 0s - loss: 0.2095 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1664 - acc: 1.0000
Epoch 35/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1813 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1583 - acc: 1.0000
Epoch 36/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0990 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1641 - acc: 1.0000
Epoch 37/100
12/12 [==============================] - 0s - loss: 0.1587 - acc: 1.0000
Epoch 38/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1789 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1516 - acc: 1.0000
Epoch 39/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1566 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1503 - acc: 1.0000
Epoch 40/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1853 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1477 - acc: 1.0000
Epoch 41/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1894 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1431 - acc: 1.0000
Epoch 42/100
12/12 [==============================] - 0s - loss: 0.1446 - acc: 1.0000
Epoch 43/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1411 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1438 - acc: 1.0000
Epoch 44/100
12/12 [==============================] - 0s - loss: 0.1393 - acc: 1.0000
Epoch 45/100
12/12 [==============================] - 0s - loss: 0.1359 - acc: 1.0000
Epoch 46/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1131 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1370 - acc: 1.0000
Epoch 47/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1638 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1320 - acc: 1.0000
Epoch 48/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0955 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1348 - acc: 1.0000
Epoch 49/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1208 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1324 - acc: 1.0000
Epoch 50/100
12/12 [==============================] - 0s - loss: 0.1272 - acc: 1.0000
Epoch 51/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0908 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1257 - acc: 1.0000
Epoch 52/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1060 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1287 - acc: 1.0000
Epoch 53/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1581 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1232 - acc: 1.0000
Epoch 54/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1182 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1247 - acc: 1.0000
Epoch 55/100
12/12 [==============================] - 0s - loss: 0.1204 - acc: 1.0000
Epoch 56/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1415 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1212 - acc: 1.0000
Epoch 57/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1370 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1194 - acc: 1.0000
Epoch 58/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1475 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1186 - acc: 1.0000
Epoch 59/100
12/12 [==============================] - 0s - loss: 0.1177 - acc: 1.0000
Epoch 60/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0460 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1141 - acc: 1.0000
Epoch 61/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0797 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1118 - acc: 1.0000
Epoch 62/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0996 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1134 - acc: 1.0000
Epoch 63/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1095 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1114 - acc: 1.0000
Epoch 64/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0679 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1083 - acc: 1.0000
Epoch 65/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1589 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1073 - acc: 1.0000
Epoch 66/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1041 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1078 - acc: 1.0000
Epoch 67/100
12/12 [==============================] - 0s - loss: 0.1054 - acc: 1.0000
Epoch 68/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0726 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1056 - acc: 1.0000
Epoch 69/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1213 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1062 - acc: 1.0000
Epoch 70/100
12/12 [==============================] - 0s - loss: 0.1043 - acc: 1.0000
Epoch 71/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1015 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1025 - acc: 1.0000
Epoch 72/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1207 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1016 - acc: 1.0000
Epoch 73/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0962 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.1031 - acc: 1.0000
Epoch 74/100
12/12 [==============================] - 0s - loss: 0.1007 - acc: 1.0000
Epoch 75/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0877 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.0991 - acc: 1.0000
Epoch 76/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0760 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.0987 - acc: 1.0000
Epoch 77/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0844 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.0966 - acc: 1.0000
Epoch 78/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0940 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.0961 - acc: 1.0000
Epoch 79/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0933 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.0966 - acc: 1.0000
Epoch 80/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1216 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.0944 - acc: 1.0000
Epoch 81/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0937 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.0955 - acc: 1.0000
Epoch 82/100
12/12 [==============================] - 0s - loss: 0.0939 - acc: 1.0000
Epoch 83/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0913 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.0919 - acc: 1.0000
Epoch 84/100
4/12 [=========>....................] - ETA: 0s - loss: 0.1047 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.0914 - acc: 1.0000
Epoch 85/100
12/12 [==============================] - 0s - loss: 0.0913 - acc: 1.0000
Epoch 86/100
12/12 [==============================] - 0s - loss: 0.0889 - acc: 1.0000
Epoch 87/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0859 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.0896 - acc: 1.0000
Epoch 88/100
12/12 [==============================] - 0s - loss: 0.0900 - acc: 1.0000
Epoch 89/100
12/12 [==============================] - 0s - loss: 0.0884 - acc: 1.0000
Epoch 90/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0580 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.0857 - acc: 1.0000
Epoch 91/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0574 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.0883 - acc: 1.0000
Epoch 92/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0830 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.0844 - acc: 1.0000
Epoch 93/100
12/12 [==============================] - 0s - loss: 0.0863 - acc: 1.0000
Epoch 94/100
12/12 [==============================] - 0s - loss: 0.0839 - acc: 1.0000
Epoch 95/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0555 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.0829 - acc: 1.0000
Epoch 96/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0696 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.0819 - acc: 1.0000
Epoch 97/100
12/12 [==============================] - 0s - loss: 0.0837 - acc: 1.0000
Epoch 98/100
4/12 [=========>....................] - ETA: 0s - loss: 0.0989 - acc: 1.0000
12/12 [==============================] - 0s - loss: 0.0808 - acc: 1.0000
Epoch 99/100
12/12 [==============================] - 0s - loss: 0.0808 - acc: 1.0000
Epoch 100/100
12/12 [==============================] - 0s - loss: 0.0797 - acc: 1.0000
Epoch 1/100
12/12 [==============================] - 0s - loss: 0.0791 - acc: 1.0000
Epoch 2/100
12/12 [==============================] - 0s - loss: 0.0789 - acc: 1.0000
Epoch 3/100
12/12 [==============================] - 0s - loss: 0.0787 - acc: 1.0000
Epoch 4/100
12/12 [==============================] - 0s - loss: 0.0785 - acc: 1.0000
Epoch 5/100
12/12 [==============================] - 0s - loss: 0.0783 - acc: 1.0000
Epoch 6/100
12/12 [==============================] - 0s - loss: 0.0781 - acc: 1.0000
Epoch 7/100
12/12 [==============================] - 0s - loss: 0.0779 - acc: 1.0000
Epoch 8/100
12/12 [==============================] - 0s - loss: 0.0778 - acc: 1.0000
Epoch 9/100
12/12 [==============================] - 0s - loss: 0.0776 - acc: 1.0000
Epoch 10/100
12/12 [==============================] - 0s - loss: 0.0774 - acc: 1.0000
Epoch 11/100
12/12 [==============================] - 0s - loss: 0.0772 - acc: 1.0000
Epoch 12/100
12/12 [==============================] - 0s - loss: 0.0770 - acc: 1.0000
Epoch 13/100
12/12 [==============================] - 0s - loss: 0.0769 - acc: 1.0000
Epoch 14/100
12/12 [==============================] - 0s - loss: 0.0767 - acc: 1.0000
Epoch 15/100
12/12 [==============================] - 0s - loss: 0.0765 - acc: 1.0000
Epoch 16/100
12/12 [==============================] - 0s - loss: 0.0763 - acc: 1.0000
Epoch 17/100
12/12 [==============================] - 0s - loss: 0.0762 - acc: 1.0000
Epoch 18/100
12/12 [==============================] - 0s - loss: 0.0760 - acc: 1.0000
Epoch 19/100
12/12 [==============================] - 0s - loss: 0.0758 - acc: 1.0000
Epoch 20/100
12/12 [==============================] - 0s - loss: 0.0757 - acc: 1.0000
Epoch 21/100
12/12 [==============================] - 0s - loss: 0.0755 - acc: 1.0000
Epoch 22/100
12/12 [==============================] - 0s - loss: 0.0753 - acc: 1.0000
Epoch 23/100
12/12 [==============================] - 0s - loss: 0.0751 - acc: 1.0000
Epoch 24/100
12/12 [==============================] - 0s - loss: 0.0750 - acc: 1.0000
Epoch 25/100
12/12 [==============================] - 0s - loss: 0.0748 - acc: 1.0000
Epoch 26/100
12/12 [==============================] - 0s - loss: 0.0746 - acc: 1.0000
Epoch 27/100
12/12 [==============================] - 0s - loss: 0.0745 - acc: 1.0000
Epoch 28/100
12/12 [==============================] - 0s - loss: 0.0743 - acc: 1.0000
Epoch 29/100
12/12 [==============================] - 0s - loss: 0.0741 - acc: 1.0000
Epoch 30/100
12/12 [==============================] - 0s - loss: 0.0740 - acc: 1.0000
Epoch 31/100
12/12 [==============================] - 0s - loss: 0.0738 - acc: 1.0000
Epoch 32/100
12/12 [==============================] - 0s - loss: 0.0737 - acc: 1.0000
Epoch 33/100
12/12 [==============================] - 0s - loss: 0.0735 - acc: 1.0000
Epoch 34/100
12/12 [==============================] - 0s - loss: 0.0733 - acc: 1.0000
Epoch 35/100
12/12 [==============================] - 0s - loss: 0.0732 - acc: 1.0000
Epoch 36/100
12/12 [==============================] - 0s - loss: 0.0730 - acc: 1.0000
Epoch 37/100
12/12 [==============================] - 0s - loss: 0.0729 - acc: 1.0000
Epoch 38/100
12/12 [==============================] - 0s - loss: 0.0727 - acc: 1.0000
Epoch 39/100
12/12 [==============================] - 0s - loss: 0.0725 - acc: 1.0000
Epoch 40/100
12/12 [==============================] - 0s - loss: 0.0724 - acc: 1.0000
Epoch 41/100
12/12 [==============================] - 0s - loss: 0.0722 - acc: 1.0000
Epoch 42/100
12/12 [==============================] - 0s - loss: 0.0721 - acc: 1.0000
Epoch 43/100
12/12 [==============================] - 0s - loss: 0.0719 - acc: 1.0000
Epoch 44/100
12/12 [==============================] - 0s - loss: 0.0718 - acc: 1.0000
Epoch 45/100
12/12 [==============================] - 0s - loss: 0.0716 - acc: 1.0000
Epoch 46/100
12/12 [==============================] - 0s - loss: 0.0715 - acc: 1.0000
Epoch 47/100
12/12 [==============================] - 0s - loss: 0.0713 - acc: 1.0000
Epoch 48/100
12/12 [==============================] - 0s - loss: 0.0712 - acc: 1.0000
Epoch 49/100
12/12 [==============================] - 0s - loss: 0.0710 - acc: 1.0000
Epoch 50/100
12/12 [==============================] - 0s - loss: 0.0709 - acc: 1.0000
Epoch 51/100
12/12 [==============================] - 0s - loss: 0.0707 - acc: 1.0000
Epoch 52/100
12/12 [==============================] - 0s - loss: 0.0706 - acc: 1.0000
Epoch 53/100
12/12 [==============================] - 0s - loss: 0.0704 - acc: 1.0000
Epoch 54/100
12/12 [==============================] - 0s - loss: 0.0703 - acc: 1.0000
Epoch 55/100
12/12 [==============================] - 0s - loss: 0.0701 - acc: 1.0000
Epoch 56/100
12/12 [==============================] - 0s - loss: 0.0700 - acc: 1.0000
Epoch 57/100
12/12 [==============================] - 0s - loss: 0.0698 - acc: 1.0000
Epoch 58/100
12/12 [==============================] - 0s - loss: 0.0697 - acc: 1.0000
Epoch 59/100
12/12 [==============================] - 0s - loss: 0.0695 - acc: 1.0000
Epoch 60/100
12/12 [==============================] - 0s - loss: 0.0694 - acc: 1.0000
Epoch 61/100
12/12 [==============================] - 0s - loss: 0.0692 - acc: 1.0000
Epoch 62/100
12/12 [==============================] - 0s - loss: 0.0691 - acc: 1.0000
Epoch 63/100
12/12 [==============================] - 0s - loss: 0.0690 - acc: 1.0000
Epoch 64/100
12/12 [==============================] - 0s - loss: 0.0688 - acc: 1.0000
Epoch 65/100
12/12 [==============================] - 0s - loss: 0.0687 - acc: 1.0000
Epoch 66/100
12/12 [==============================] - 0s - loss: 0.0685 - acc: 1.0000
Epoch 67/100
12/12 [==============================] - 0s - loss: 0.0684 - acc: 1.0000
Epoch 68/100
12/12 [==============================] - 0s - loss: 0.0683 - acc: 1.0000
Epoch 69/100
12/12 [==============================] - 0s - loss: 0.0681 - acc: 1.0000
Epoch 70/100
12/12 [==============================] - 0s - loss: 0.0680 - acc: 1.0000
Epoch 71/100
12/12 [==============================] - 0s - loss: 0.0678 - acc: 1.0000
Epoch 72/100
12/12 [==============================] - 0s - loss: 0.0677 - acc: 1.0000
Epoch 73/100
12/12 [==============================] - 0s - loss: 0.0676 - acc: 1.0000
Epoch 74/100
12/12 [==============================] - 0s - loss: 0.0674 - acc: 1.0000
Epoch 75/100
12/12 [==============================] - 0s - loss: 0.0673 - acc: 1.0000
Epoch 76/100
12/12 [==============================] - 0s - loss: 0.0672 - acc: 1.0000
Epoch 77/100
12/12 [==============================] - 0s - loss: 0.0670 - acc: 1.0000
Epoch 78/100
12/12 [==============================] - 0s - loss: 0.0669 - acc: 1.0000
Epoch 79/100
12/12 [==============================] - 0s - loss: 0.0668 - acc: 1.0000
Epoch 80/100
12/12 [==============================] - 0s - loss: 0.0666 - acc: 1.0000
Epoch 81/100
12/12 [==============================] - 0s - loss: 0.0665 - acc: 1.0000
Epoch 82/100
12/12 [==============================] - 0s - loss: 0.0664 - acc: 1.0000
Epoch 83/100
12/12 [==============================] - 0s - loss: 0.0662 - acc: 1.0000
Epoch 84/100
12/12 [==============================] - 0s - loss: 0.0661 - acc: 1.0000
Epoch 85/100
12/12 [==============================] - 0s - loss: 0.0660 - acc: 1.0000
Epoch 86/100
12/12 [==============================] - 0s - loss: 0.0659 - acc: 1.0000
Epoch 87/100
12/12 [==============================] - 0s - loss: 0.0657 - acc: 1.0000
Epoch 88/100
12/12 [==============================] - 0s - loss: 0.0656 - acc: 1.0000
Epoch 89/100
12/12 [==============================] - 0s - loss: 0.0655 - acc: 1.0000
Epoch 90/100
12/12 [==============================] - 0s - loss: 0.0653 - acc: 1.0000
Epoch 91/100
12/12 [==============================] - 0s - loss: 0.0652 - acc: 1.0000
Epoch 92/100
12/12 [==============================] - 0s - loss: 0.0651 - acc: 1.0000
Epoch 93/100
12/12 [==============================] - 0s - loss: 0.0650 - acc: 1.0000
Epoch 94/100
12/12 [==============================] - 0s - loss: 0.0648 - acc: 1.0000
Epoch 95/100
12/12 [==============================] - 0s - loss: 0.0647 - acc: 1.0000
Epoch 96/100
12/12 [==============================] - 0s - loss: 0.0646 - acc: 1.0000
Epoch 97/100
12/12 [==============================] - 0s - loss: 0.0645 - acc: 1.0000
Epoch 98/100
12/12 [==============================] - 0s - loss: 0.0643 - acc: 1.0000
Epoch 99/100
12/12 [==============================] - 0s - loss: 0.0642 - acc: 1.0000
Epoch 100/100
12/12 [==============================] - 0s - loss: 0.0641 - acc: 1.0000
model_weight
[array([[-0.62812889],
[ 0.62607998]], dtype=float32)]
ans
[[ 9.99987245e-01]
[ 1.22445408e-05]]
Process finished with exit code 0