Training a Handwritten Digit Recognition Model on the MNIST Dataset

Preface: this post collects the problems (and their fixes) that I ran into while learning digit recognition by following the earlier blog post 【神经网络与深度学习】使用MNIST数据集训练手写数字识别模型——[附完整训练代码].

Problem 1

Original code:

mnist = tf.keras.datasets.mnist
(train_x,train_y),(test_x,test_y) = mnist.load_data()

Error:

urllib.error.URLError: 

Solution:
Download the dataset files directly: MNIST dataset.
After downloading, there is no need to decompress them; simply place the four .gz archives together in one folder.
Comment out the original code and add the following:

import gzip
import numpy as np

# Load MNIST from local .gz files instead of downloading
def load_data(path, files):
    paths = [path + each for each in files]
    # Label files: skip the 8-byte header, then read one uint8 label per example
    with gzip.open(paths[0], 'rb') as lbpath:
        train_labels = np.frombuffer(lbpath.read(), np.uint8, offset=8)
    # Image files: skip the 16-byte header, then reshape to (N, 28, 28)
    with gzip.open(paths[1], 'rb') as imgpath:
        train_images = np.frombuffer(imgpath.read(), np.uint8, offset=16).reshape(len(train_labels), 28, 28)
    with gzip.open(paths[2], 'rb') as lbpath:
        test_labels = np.frombuffer(lbpath.read(), np.uint8, offset=8)
    with gzip.open(paths[3], 'rb') as imgpath:
        test_images = np.frombuffer(imgpath.read(), np.uint8, offset=16).reshape(len(test_labels), 28, 28)
    return (train_images, train_labels), (test_images, test_labels)


path = r'Q:\t00620\Downloads\fashion-mnist/'
files = ['train-labels-idx1-ubyte.gz','train-images-idx3-ubyte.gz','t10k-labels-idx1-ubyte.gz','t10k-images-idx3-ubyte.gz']
(train_x,train_y),(test_x,test_y) = load_data(path, files)
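
As a quick sanity check that the files were read correctly, you can print the array shapes; MNIST ships 60,000 training and 10,000 test examples:

# Sanity check on the loaded arrays
print(train_x.shape, train_y.shape)  # expected: (60000, 28, 28) (60000,)
print(test_x.shape, test_y.shape)    # expected: (10000, 28, 28) (10000,)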

Problem 2

Original code:

# Normalize and convert to float32 tensors
X_train,X_test = tf.cast(train_x/255.0,tf.float32),tf.cast(test_x/255.0,tf.float32)     
y_train,y_test = tf.cast(train_y,tf.int16),tf.cast(test_y,tf.int16)

Error:

ValueError: When using data tensors as input to a model, you should specify the `steps_per_epoch` argument.

Solution:
In this version of Keras, model.fit cannot work out how many batches to draw from plain tensor inputs, so it insists on a steps_per_epoch argument; keeping the data as NumPy arrays sidesteps the problem. Comment out the original code and add the following:

# Normalize to [0, 1] and convert dtypes using NumPy
X_train = train_x/255.0
X_train = X_train.astype(np.float32)

X_test = test_x/255.0
X_test = X_test.astype(np.float32)

y_train = train_y.astype(np.int16)
y_test = test_y.astype(np.int16)
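
With the data held as NumPy arrays, the usual Keras training call runs without the steps_per_epoch complaint. Here is a minimal sketch; the architecture below is a common MNIST baseline I am assuming for illustration, not necessarily the exact model from the referenced blog:

import tensorflow as tf

# A common MNIST baseline: flatten -> hidden dense layer -> 10-way softmax
model = tf.keras.Sequential([
    tf.keras.layers.Flatten(input_shape=(28, 28)),
    tf.keras.layers.Dense(128, activation='relu'),
    tf.keras.layers.Dense(10, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# NumPy inputs let Keras infer the batching itself; no steps_per_epoch needed
model.fit(X_train, y_train, batch_size=64, epochs=5,
          validation_data=(X_test, y_test))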

Problem 3

Original code:

demo = tf.reshape(X_test[num],(1,28,28))

Error:

ValueError: When using data tensors as input to a model, you should specify the `steps` argument.

Solution:
This is the same root cause as Problem 2: tf.reshape returns a TensorFlow tensor, which predict cannot batch without a steps argument. Reshape with NumPy instead:

demo = np.reshape(X_test[num],(1,28,28))
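
The reshaped sample can then be fed to predict as a NumPy array. A minimal sketch follows; num is assumed to be a test-set index chosen as in the original blog, and model is the trained model from above:

# Reshape one test image into a batch of size 1 and predict its class
demo = np.reshape(X_test[num], (1, 28, 28))
pred = model.predict(demo)            # shape (1, 10): class probabilities
print('predicted digit:', np.argmax(pred), '| true label:', y_test[num])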

With these changes, the code runs successfully. Thanks again to "善良995" for the original blog post.
