Andrew Ng's Introductory Machine Learning, Course 2 Week 3 — Evaluating and Improving Models

Evaluating and Improving Machine Learning Models

    • 1 Imports
    • 2 - Evaluating a Learning Algorithm (Polynomial Regression)
      • 2.1 Splitting the dataset
        • 2.1.1 Plot the training and test sets
      • 2.2 Error calculation for model evaluation, linear regression
      • 2.3 Comparing performance on training and test data
    • 3 Bias and Variance
      • 3.1 Plot the training, cross-validation, and test sets
      • 3.2 Finding the optimal polynomial degree
      • 3.3 Regularization
      • 3.4 Increasing the dataset size
    • 4 Evaluating a Neural Network Learning Algorithm
      • 4.1 Dataset
      • 4.2 Evaluating a classification model by computing classification error
    • 5 Model Complexity
      • 5.1 Complex model
      • 5.2 Simple model

Explore techniques for evaluating and improving machine learning models.

1 Imports

import numpy as np
%matplotlib widget
import matplotlib.pyplot as plt
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.preprocessing import StandardScaler, PolynomialFeatures
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error
import tensorflow as tf
from keras.models import Sequential
from keras.layers import Dense
from keras.activations import relu,linear
from keras.losses import SparseCategoricalCrossentropy
from keras.optimizers import Adam
import logging
logging.getLogger("tensorflow").setLevel(logging.ERROR)

from public_tests_a1 import *

tf.keras.backend.set_floatx('float64')
from assigment_utils import *

tf.autograph.set_verbosity(0)

2 - Evaluating a Learning Algorithm (Polynomial Regression)

Figure

Suppose you have created a machine learning model and found it fits your training data very well. Are you done? Not quite. The goal of creating the model is to be able to predict values for new examples.

How can you test your model's performance on new data before deploying it?
The answer has two parts:

  • Split your original dataset into "training" and "test" sets.
    • Use the training data to fit the parameters of the model
    • Use the test data to evaluate the model's performance on new data
  • Develop an error function to evaluate your model.

2.1 Splitting the dataset

'''
This is a Python function that generates a dataset based on x² with added noise. Its inputs are m, seed, and scale, where m is the number of data points, seed is the random seed, and scale controls the size of the noise. The function returns four variables: x_train, y_train, x_ideal, and y_ideal.

Specifically, the function first generates x_train, an array of m values evenly spaced from 0 to 49, which are the x values used for fitting. It computes the ideal, noise-free targets y_ideal = x². Then, using the given random seed, it draws random numbers and adds noise proportional to y_ideal to produce the final training targets y_train. The function also returns x_ideal and y_ideal so the ideal curve can be redrawn when needed.
'''
'''
In data analysis and machine learning, noise usually refers to unwanted random error or interference in the data. It can come from the precision limits of measurement devices, interference during signal transmission, environmental noise during data collection, and other factors.

In this function, to simulate the noise found in real data, we add a set of scaled random numbers to the ideal x² values to obtain the final trainable data y_train. This makes the generated data closer to a real-world scenario and better suited for validating the robustness and generalization ability of a model.
'''
def gen_data(m, seed=1, scale=0.7):
    """ generate a data set based on a x^2 with added noise """
    c = 0
    x_train = np.linspace(0,49,m)
    np.random.seed(seed)
    y_ideal = x_train**2 + c
    y_train = y_ideal + scale * y_ideal*(np.random.sample((m,))-0.5)
    x_ideal = x_train #for redraw when new data included in X
    return x_train, y_train, x_ideal, y_ideal

X,y,x_ideal,y_ideal = gen_data(18, 2, 0.7)
print("X.shape", X.shape, "y.shape", y.shape)
'''
This code splits the dataset into training and test sets. X is the feature data and y is the target data. test_size=0.33 puts roughly one third of the examples in the test set and two thirds in the training set, and random_state=1 fixes the random seed so every run produces the same split.
Concretely, train_test_split divides X and y into four arrays: X_train, X_test, y_train, and y_test. X_train and y_train are used to train the model, while X_test and y_test are used to evaluate its performance.
'''
X_train, X_test, y_train, y_test = train_test_split(X,y,test_size=0.33, random_state=1)
print("X_train.shape", X_train.shape, "y_train.shape", y_train.shape)
print("X_test.shape", X_test.shape, "y_test.shape", y_test.shape)

2.1.1 Plot the training and test sets

Below you can see the data points that will be part of training (in red) mixed in with the data the model is not trained on (test). This particular dataset is a quadratic function with added noise. The dashed curve shows the "ideal" function for reference.

fig, ax = plt.subplots(1,1,figsize=(4,4))
ax.plot(x_ideal, y_ideal, "--", color = "orangered", label="y_ideal", lw=1)
ax.set_title("Training, Test",fontsize = 14)
ax.set_xlabel("x")
ax.set_ylabel("y")

ax.scatter(X_train, y_train, color = "red",           label="train")
ax.scatter(X_test, y_test,   color = dlc["dlblue"],   label="test")
ax.legend(loc='upper left')
plt.show()
Figure

2.2 Error calculation for model evaluation, linear regression

When evaluating a linear regression model, you average the squared difference between the predictions and the target values:
$$J_\text{test}(\mathbf{w},b) = \frac{1}{2m_\text{test}}\sum_{i=0}^{m_\text{test}-1} \left( f_{\mathbf{w},b}(\mathbf{x}^{(i)}_\text{test}) - y^{(i)}_\text{test} \right)^2 \tag{1}$$

# mean squared error
def eval_mse(y, yhat):
    m = len(y)
    err = 0.0
    for i in range(m):
        err += np.power(y[i] - yhat[i], 2)
    err /= 2*m
    return err
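
For reference, a vectorized NumPy equivalent (an illustrative sketch, not part of the lab utilities) computes the same quantity in one line:

def eval_mse_vec(y, yhat):
    """ vectorized equivalent of eval_mse above """
    return np.mean((np.asarray(y) - np.asarray(yhat))**2) / 2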

2.3 Comparing performance on training and test data

Let's build a high-degree polynomial model to minimize training error. This uses linear regression from sklearn under the hood; if you want the details, look at the code in the imported utility file (a sketch of what that helper might look like follows the code below). The steps are:

  • create and fit the model ('fit' is another name for training, or running gradient descent).
  • compute the error on the training data.
  • compute the error on the test data.
num = 10
lmodel = lin_model(num)
lmodel.fit(X_train,y_train)

yhat = lmodel.predict(X_train)
err_train = lmodel.mse(y_train, yhat)

yhat = lmodel.predict(X_test)
err_test = lmodel.mse(y_test,yhat)
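
lin_model is defined in the imported utility file. As a rough, illustrative sketch (the real helper's details may differ), it can be thought of as a pipeline of PolynomialFeatures, StandardScaler, and LinearRegression (or Ridge when regularization is requested), all imported at the top of this notebook:

class lin_model_sketch:
    """ illustrative stand-in for the lab's lin_model helper; details assumed """
    def __init__(self, degree, regularization=False, lambda_=0):
        self.poly = PolynomialFeatures(degree, include_bias=False)
        self.scaler = StandardScaler()
        self.model = Ridge(alpha=lambda_) if regularization else LinearRegression()

    def fit(self, X, y):
        X_mapped = self.poly.fit_transform(X.reshape(-1, 1))  # add polynomial features
        X_scaled = self.scaler.fit_transform(X_mapped)        # z-score normalize features
        self.model.fit(X_scaled, y)
        return self

    def predict(self, X):
        X_mapped = self.poly.transform(X.reshape(-1, 1))
        X_scaled = self.scaler.transform(X_mapped)
        return self.model.predict(X_scaled)

    def mse(self, y, yhat):
        return mean_squared_error(y, yhat) / 2  # matches eval_mse's 1/(2m) convention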

The error computed on the training data is substantially less than that on the test data.

print(f"training err {err_train:0.2f}, test err {err_test:0.2f}")
training err 58.01, test err 171215.01

The plot below shows why this is. The model fits the training data very closely, but to do so it has created a complex function. The test data was not part of the training, so the model predicts poorly on it.
This model can therefore be described as 1) overfitting, 2) having high variance, and 3) generalizing poorly.

x = np.linspace(0,int(X.max()),100)  # predict values for plot
y_pred = lmodel.predict(x).reshape(-1,1)

plt_train_test(X_train, y_train, X_test, y_test, x, y_pred, x_ideal, y_ideal, num)

Figure

The test set error above shows this model will not work well on new data. If you used the test error to guide improvements to the model, then the model would end up performing well on the test data... but the test data was meant to represent new data.

You need yet another set of data to test performance on genuinely new data.

The proposal made during lecture is to separate the data into three groups. The distribution of training, cross-validation, and test sets shown in the table below is typical, but can vary depending on the amount of data available.

Data set          % of total   Description
Training          60           Data used to tune the model parameters w and b, i.e., for training or fitting
Cross-validation  20           Data used to tune other model parameters like the polynomial degree, regularization, or the architecture of a neural network
Test              20           Data used to test the model after tuning, to gauge performance on new data

Let's generate three datasets below. We'll once again use train_test_split from sklearn, but we'll call it twice to get three splits:

X,y, x_ideal,y_ideal = gen_data(40, 5, 0.7)
print("X.shape", X.shape, "y.shape", y.shape)
X_train, X_, y_train, y_ = train_test_split(X,y,test_size=0.40, random_state=1)
X_cv, X_test, y_cv, y_test = train_test_split(X_,y_,test_size=0.50, random_state=1)
print("X_train.shape", X_train.shape, "y_train.shape", y_train.shape)
print("X_cv.shape", X_cv.shape, "y_cv.shape", y_cv.shape)
print("X_test.shape", X_test.shape, "y_test.shape", y_test.shape)
X.shape (40,) y.shape (40,)
X_train.shape (24,) y_train.shape (24,)
X_cv.shape (8,) y_cv.shape (8,)
X_test.shape (8,) y_test.shape (8,)

3 Bias and Variance

Above, it was clear that the degree of the polynomial model was too high. How can you choose a good value? It turns out, as the diagram shows, the training and cross-validation performance can provide guidance. By trying a range of degree values, the training and cross-validation performance can be evaluated. As the degree becomes too large, the cross-validation performance will start to degrade relative to the training performance. Let's try this on our example.

Figure

3.1 Plot the training, cross-validation, and test sets

Below you can see the data points that will become the training data (in red) mixed in with the data the model is not trained on (the test and cross-validation sets).

fig, ax = plt.subplots(1,1,figsize=(4,4))
ax.plot(x_ideal, y_ideal, "--", color = "orangered", label="y_ideal", lw=1)
ax.set_title("Training, CV, Test",fontsize = 14)
ax.set_xlabel("x")
ax.set_ylabel("y")

ax.scatter(X_train, y_train, color = "red",           label="train")
ax.scatter(X_cv, y_cv,       color = dlc["dlorange"], label="cv")
ax.scatter(X_test, y_test,   color = dlc["dlblue"],   label="test")
ax.legend(loc='upper left')
plt.show()

Figure

3.2 Finding the optimal polynomial degree

In previous labs, you found that you could create a model capable of fitting complex curves by utilizing polynomials (see the Course 1, Week 2 "Feature Engineering and Polynomial Regression" lab). Further, you demonstrated that by increasing the degree of the polynomial you could create overfitting (see the Course 1, Week 3 overfitting lab). Let's use that knowledge here to test our ability to tell overfitting and underfitting apart.

Let's train the model repeatedly, increasing the degree of the polynomial each iteration. Here we'll use the scikit-learn linear regression model for speed and simplicity.

max_degree = 9
err_train = np.zeros(max_degree)
err_cv = np.zeros(max_degree)
x = np.linspace(0,int(X.max()),100)
y_pred = np.zeros((100,max_degree))  #columns are lines to plot

for degree in range(max_degree):
    lmodel = lin_model(degree+1)
    lmodel.fit(X_train, y_train)
    yhat = lmodel.predict(X_train)
    err_train[degree] = lmodel.mse(y_train, yhat)
    yhat = lmodel.predict(X_cv)
    err_cv[degree] = lmodel.mse(y_cv, yhat)
    y_pred[:,degree] = lmodel.predict(x)

optimal_degree = np.argmin(err_cv)+1

Let’s plot the result:

plt.close("all")
plt_optimal_degree(X_train, y_train, X_cv, y_cv, x, y_pred, x_ideal, y_ideal,
                   err_train, err_cv, optimal_degree, max_degree)

Figure

The plot above demonstrates that separating data into two groups, one the model is trained on and one it is not, can be used to determine whether a model is underfitting or overfitting. In our example, we created a variety of models, ranging from underfitting to overfitting, by increasing the polynomial degree.

  • In the left plot, the solid lines represent the predictions of these models. A polynomial of degree 1 produces a straight line that intersects very few data points, while the maximum degree hews very closely to every data point.
  • In the right plot:
    • the error on the training data (blue) decreases as the model complexity increases, as expected
    • the error on the cross-validation data decreases initially as the model starts to conform to the data, but then increases as the model starts to overfit the training data (fails to generalize).

It's worth noting that the curves in these examples are not as smooth as one might draw in a lecture. Clearly, the specific data points assigned to each group can significantly change your results. The general trend is what's important.

3.3 Regularization

In previous labs, you have utilized regularization to reduce overfitting. Similar to degree, the same methodology can be used to tune the regularization parameter lambda (λ). Let's demonstrate this by starting with a high-degree polynomial and varying the regularization parameter.

lambda_range = np.array([0.0, 1e-6, 1e-5, 1e-4,1e-3,1e-2, 1e-1,1,10,100])
num_steps = len(lambda_range)
degree = 10
err_train = np.zeros(num_steps)
err_cv = np.zeros(num_steps)
x = np.linspace(0,int(X.max()),100)
y_pred = np.zeros((100,num_steps))  #columns are lines to plot

for i in range(num_steps):
    lambda_= lambda_range[i]
    lmodel = lin_model(degree, regularization=True, lambda_=lambda_)
    lmodel.fit(X_train, y_train)
    yhat = lmodel.predict(X_train)
    err_train[i] = lmodel.mse(y_train, yhat)
    yhat = lmodel.predict(X_cv)
    err_cv[i] = lmodel.mse(y_cv, yhat)
    y_pred[:,i] = lmodel.predict(x)

optimal_reg_idx = np.argmin(err_cv)
plt.close("all")
plt_tune_regularization(X_train, y_train, X_cv, y_cv, x, y_pred, err_train, err_cv, optimal_reg_idx, lambda_range)

Figure

The plot above shows that as regularization increases, the model moves from a high-variance (overfitting) model to a high-bias (underfitting) model. The vertical line in the right plot shows the optimal value of lambda. In this example, the polynomial degree was set to 10.

3.4 Increasing the dataset size

This code defines a function named tune_m, which tunes the number of training examples to reduce overfitting.

Specifically, the function first sets a base sample count m = 50, then builds the array m_range containing multiples of m from 1×m to 15×m. It also sets the degree of the polynomial model to be trained, and creates the arrays err_train and err_cv to record the training and cross-validation errors.

In the main loop, the function calls gen_data to generate a dataset and splits it into three parts: a training set, a cross-validation set, and a test set. It then creates a linear model object lmodel and trains it. Next, it uses the model to predict on the training and cross-validation sets and computes the MSE for each. Finally, it stores the model's predictions over a grid of points in the matrix y_pred.

The function returns a tuple containing the training set, the cross-validation set, the grid x, the predictions y_pred, the training and cross-validation errors, and related information.

def tune_m():
    """ tune the number of examples to reduce overfitting """
    m = 50
    m_range = np.array(m*np.arange(1,16))
    num_steps = m_range.shape[0]
    degree = 16
    err_train = np.zeros(num_steps)
    err_cv = np.zeros(num_steps)
    y_pred = np.zeros((100,num_steps))

    for i in range(num_steps):
        X, y, x_ideal, y_ideal = gen_data(m_range[i],5,0.7)  # gen_data returns (x_train, y_train, x_ideal, y_ideal)
        x = np.linspace(0,int(X.max()),100)
        X_train, X_, y_train, y_ = train_test_split(X,y,test_size=0.40, random_state=1)
        X_cv, X_test, y_cv, y_test = train_test_split(X_,y_,test_size=0.50, random_state=1)

        lmodel = lin_model(degree)  # no regularization
        lmodel.fit(X_train, y_train)
        yhat = lmodel.predict(X_train)
        err_train[i] = lmodel.mse(y_train, yhat)
        yhat = lmodel.predict(X_cv)
        err_cv[i] = lmodel.mse(y_cv, yhat)
        y_pred[:,i] = lmodel.predict(x)
    return(X_train, y_train, X_cv, y_cv, x, y_pred, err_train, err_cv, m_range,degree)
X_train, y_train, X_cv, y_cv, x, y_pred, err_train, err_cv, m_range,degree = tune_m()
plt_tune_m(X_train, y_train, X_cv, y_cv, x, y_pred, err_train, err_cv, m_range, degree)

Figure

The plots above show that when a model has high variance and is overfitting, adding more examples improves performance. Note the curves on the left plot. The final curve with the highest value of m is a smooth curve that sits in the center of the data. On the right, as the number of examples increases, the performance of the training set and cross-validation set converge to similar values. Note that the curves are not as smooth as one might see in a lecture. That is to be expected. The trend remains clear: more data improves generalization.

> Note that adding more examples when the model has high bias (underfitting) does not improve performance.

4 Evaluating a Neural Network Learning Algorithm

Above, you tuned aspects of a polynomial regression model. Here, you will work with a neural network model. Let's start by creating a classification dataset.

4.1 Dataset

Generate the dataset and split it into training, cross-validation (CV), and test sets. In this example, we're increasing the percentage of cross-validation data points for emphasis.

'''
This code splits the original dataset (X and y) into three parts: a training set, a cross-validation set, and a test set. First, train_test_split splits the original data into the training set (X_train and y_train) and a remainder (X_ and y_); test_size=0.50 randomly assigns half of the original data to the training set and the other half to the remainder.

Next, train_test_split is called again to split the remainder into the cross-validation set (X_cv and y_cv) and the test set (X_test and y_test); test_size=0.20 keeps 80% of the remainder as the cross-validation set and assigns the other 20% to the test set. Because random_state=1 is set, every run produces the same split, which helps make the experiment reproducible.
'''
X, y, centers, classes, std = gen_blobs()

# split the data. Large CV population for demonstration
X_train, X_, y_train, y_ = train_test_split(X,y,test_size=0.50, random_state=1)
X_cv, X_test, y_cv, y_test = train_test_split(X_,y_,test_size=0.20, random_state=1)
print("X_train.shape:", X_train.shape, "X_cv.shape:", X_cv.shape, "X_test.shape:", X_test.shape)
X_train.shape: (400, 2) X_cv.shape: (320, 2) X_test.shape: (80, 2)
plt_train_eq_dist(X_train, y_train,classes, X_cv, y_cv, centers, std)

Figure

In the plot on the left, you can see the data. The colors identify the six clusters. Both the training points (dots) and the cross-validation points (triangles) are shown. The interesting points are those that lie in ambiguous locations, where either cluster might consider them members. What would you expect a neural network model to do? What would be an example of overfitting? Of underfitting?

On the right is an example of an "ideal" model, or a model one might create knowing the source of the data. The lines represent "equal distance" boundaries, where the distance to the nearest center points is equal. It's worth noting that this model would "misclassify" roughly 8% of the total dataset.
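
Such an "ideal" classifier simply assigns each point to its nearest cluster center. A minimal sketch, assuming the centers array returned by gen_blobs holds the true cluster centers (the helper name ideal_predict is illustrative, not from the lab):

def ideal_predict(X, centers):
    """ assign each point to its nearest cluster center """
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)  # (m, n_centers) distances
    return np.argmin(d, axis=1)

Something like eval_cat_err(y_cv, ideal_predict(X_cv, centers)) would then estimate the ~8% error floor mentioned above.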

4.2 Evaluating a classification model by computing classification error

The evaluation function for the classification model used here is simply the fraction of incorrect predictions:
$$J_{cv} = \frac{1}{m}\sum_{i=0}^{m-1} \begin{cases} 1, & \text{if } \hat{y}^{(i)} \neq y^{(i)} \\ 0, & \text{otherwise} \end{cases}$$

def eval_cat_err(y,yhat):
    m = len(y)
    incorrect = 0
    for i in range(m):
        if (y[i]!=yhat[i]):
            incorrect+=1
    cerr = incorrect/m
    return cerr
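
As with the MSE above, a vectorized equivalent (an illustrative alternative, not part of the lab code) is a one-liner:

def eval_cat_err_vec(y, yhat):
    """ vectorized fraction of incorrect predictions """
    return np.mean(np.asarray(y) != np.asarray(yhat))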

5 Model Complexity

Below, you will build two models: a complex model and a simple model. You will evaluate the models to determine whether they are likely to overfit or underfit.

5.1 Complex model

Below, compose a three-layer model:

  • a Dense layer with 120 units and relu activation
  • a Dense layer with 40 units and relu activation
  • a Dense layer with 6 units and a linear activation (not softmax)

Compile using:

  • loss with SparseCategoricalCrossentropy, remembering to set from_logits=True
  • the Adam optimizer with a learning rate of 0.01.
tf.random.set_seed(1234)
model = Sequential(
    [
        tf.keras.layers.Dense(120,activation='relu'),
        tf.keras.layers.Dense(40,activation='relu'),
        tf.keras.layers.Dense(6,activation='linear'),
    ],name = 'Complex'
)
model.compile(
    loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.01)
)
model.fit(
    X_train,y_train,
    epochs=1000
)
Epoch 1/1000
13/13 [==============================] - 1s 3ms/step - loss: 1.0820
Epoch 2/1000
13/13 [==============================] - 0s 2ms/step - loss: 0.4062
Epoch 3/1000
13/13 [==============================] - 0s 3ms/step - loss: 0.3124
...
Epoch 548/1000
13/13 [==============================] - 0s 2ms/step - loss: 0.0343
model.summary()
Model: "Complex"
_________________________________________________________________
 Layer (type)                Output Shape              Param #   
=================================================================
 dense (Dense)               (None, 120)               360       
                                                                 
 dense_1 (Dense)             (None, 40)                4840      
                                                                 
 dense_2 (Dense)             (None, 6)                 246       
                                                                 
=================================================================
Total params: 5,446
Trainable params: 5,446
Non-trainable params: 0
_________________________________________________________________
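The parameter counts follow the rule units × (inputs + 1): each Dense unit has one weight per input plus one bias. Assuming 2 input features (inferred from the 360 figure, since 120 × (2 + 1) = 360):

dense:   120 × (2 + 1)    = 360
dense_1: 40 × (120 + 1)   = 4840
dense_2: 6 × (40 + 1)     = 246
total:   360 + 4840 + 246 = 5,446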
# map raw logits -> softmax probabilities -> predicted class index
model_predict = lambda Xl: np.argmax(tf.nn.softmax(model.predict(Xl)).numpy(),axis=1)
plt_nn(model_predict,X_train,y_train, classes, X_cv, y_cv, suptitle="Complex Model")
1082/1082 [==============================] - 2s 2ms/step
1082/1082 [==============================] - 2s 1ms/step
Figure

This model works very hard to capture the outliers of every class. As a result, it misclassifies some of the cross-validation data. Let's compute the categorization error.

training_cerr_complex = eval_cat_err(y_train, model_predict(X_train))
cv_cerr_complex = eval_cat_err(y_cv, model_predict(X_cv))
print(f"categorization error, training, complex model: {training_cerr_complex:0.3f}")
print(f"categorization error, cv,       complex model: {cv_cerr_complex:0.3f}")
13/13 [==============================] - 0s 2ms/step
10/10 [==============================] - 0s 2ms/step
categorization error, training, complex model: 0.015
categorization error, cv,       complex model: 0.113
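For reference, eval_cat_err was introduced in section 4.2 and measures the fraction of misclassified examples. A minimal sketch, assuming that behavior:

import numpy as np

def eval_cat_err(y, yhat):
    """Categorization error: the fraction of examples where yhat != y."""
    incorrect = np.sum(yhat != y)  # count mismatched predictions
    return incorrect / len(y)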

5.2 Simple Model

Now, let's try a simple model.

Write a two-layer model:

  • A Dense layer with 6 units and relu activation
  • A Dense layer with 6 units and linear activation

Compile it using:

  • SparseCategoricalCrossentropy loss; remember to use from_logits=True
  • The Adam optimizer with a learning rate of 0.01
tf.random.set_seed(1234)
model_s = Sequential(
    [
        ### START CODE HERE ###
        tf.keras.layers.Dense(6, activation="relu"),    # hidden layer
        tf.keras.layers.Dense(6, activation="linear")   # output layer: 6 logits, one per class
        ### END CODE HERE ###
    ], name = "Simple"
)
model_s.compile(
    ### START CODE HERE ###
    loss=SparseCategoricalCrossentropy(from_logits=True),
    optimizer=tf.keras.optimizers.Adam(learning_rate=0.01),
    ### END CODE HERE ###
)
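Note the design choice, shared with the complex model: the output layer is linear and from_logits=True tells the loss to apply the softmax internally, which is more numerically stable than having the network emit probabilities directly. Probabilities are then recovered at prediction time with tf.nn.softmax, as in the model_predict lambda above.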
import logging
logging.getLogger("tensorflow").setLevel(logging.ERROR)

# BEGIN UNIT TEST
model_s.fit(
    X_train,y_train,
    epochs=1000
)
# END UNIT TEST
Epoch 1/1000
13/13 [==============================] - 1s 3ms/step - loss: 1.9489
Epoch 2/1000
13/13 [==============================] - 0s 3ms/step - loss: 1.6099
Epoch 3/1000
13/13 [==============================] - 0s 3ms/step - loss: 1.3808
...
Epoch 472/1000
13/13 [==============================] - 0s 2ms/step - loss: 0.1763
(remaining training output truncated; the loss drops quickly over the first ~30 epochs and then plateaus around 0.18)
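The natural next step, mirroring the complex-model evaluation above, is to compute the simple model's categorization error. A sketch under that assumption (model_predict_s and the *_simple names are hypothetical):

model_predict_s = lambda Xl: np.argmax(tf.nn.softmax(model_s.predict(Xl)).numpy(), axis=1)
training_cerr_simple = eval_cat_err(y_train, model_predict_s(X_train))
cv_cerr_simple = eval_cat_err(y_cv, model_predict_s(X_cv))
print(f"categorization error, training, simple model: {training_cerr_simple:0.3f}")
print(f"categorization error, cv,       simple model: {cv_cerr_simple:0.3f}")

Because it cannot fit the outliers as aggressively, the simple model would be expected to show a somewhat higher training error than the complex model, but a comparable or lower cross-validation error.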
