5. Handling Overfitting in Neural Networks (Part 2): the MNIST Handwritten Digit Dataset

Handling Overfitting in Neural Networks

The MNIST Handwritten Digit Dataset

MNIST is a handwritten-digit dataset, often called the "Hello World" of deep learning.

The MNIST dataset contains:

  • Training set: 60,000 grayscale images of size 28×28
  • Test set: 10,000 grayscale images of size 28×28

There are 10 classes: the digits 0 through 9.

Import the required modules

import tensorflow as tf
from tensorflow.keras import datasets, regularizers, Input, Model
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Flatten, Dense, Dropout, Activation
from tensorflow.keras.callbacks import TensorBoard, EarlyStopping

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import time
import os

Load the MNIST dataset

(x_train, y_train), (x_test, y_test) = datasets.mnist.load_data()
print(x_train.shape)
print(y_train.shape)
print(x_test.shape)
print(y_test.shape)
(60000, 28, 28)
(60000,)
(10000, 28, 28)
(10000,)

Inspect some sample images

Define a plotting function, plot_images

Parameters:

  • images: a sequence containing multiple image arrays.
  • labels: a sequence of the corresponding labels (elements should be integers from 0 to 9).
def plot_images(images, labels):
    class_names = ['0', '1', '2', '3', '4', '5', '6', '7', '8', '9']
    fig, axes = plt.subplots(3, 10, figsize=(30,8))
    axes = axes.flatten()
    for img, label, ax in zip(images, labels, axes):
        ax.imshow(img, cmap='gray')  # MNIST images are grayscale
        ax.set_title(class_names[label])
        ax.axis('off')
    plt.tight_layout()
    plt.show()

Randomly sample 30 training-set images for viewing

np.random.seed(99)
index_list = np.random.randint(0, 60000, 30)  # the upper bound is exclusive

plot_images(x_train[index_list], y_train[index_list])

Randomly sample 30 test-set images for viewing

np.random.seed(99)
index_list = np.random.randint(0, 10000, 30)  # the upper bound is exclusive

plot_images(x_test[index_list], y_test[index_list])

Data preprocessing

Divide every pixel value by 255 to normalize the images to the range [0, 1].

x_train = x_train/255.0
x_test = x_test/255.0

To make overfitting easy to reproduce, use only 1,000 images as the training set.

np.random.seed(43)
# upper bound is exclusive; sampling is with replacement, so a few duplicates are possible
index_list = np.random.randint(0, 60000, 1000)

x_train = x_train[index_list] 
y_train = y_train[index_list]

Building the model

APIs used:

Flatten layer: tf.keras.layers.Flatten

Fully connected layer: tf.keras.layers.Dense

Parameters used:

  • units: an integer, the number of neurons in the fully connected layer.
  • activation: the activation function.

    Options:

    • 'sigmoid': sigmoid activation

    • 'tanh': tanh activation

    • 'relu': ReLU activation

    • 'elu' or tf.keras.activations.elu(alpha=1.0): ELU activation

    • 'selu': SELU activation

    • 'swish': swish activation (available from TF 2.2 onward)

    • 'softmax': softmax function

  • kernel_initializer: weight initialization; the default is 'glorot_uniform' (Xavier uniform initialization).

    Options:

    • 'RandomNormal': samples from a normal distribution with mean 0 and standard deviation 0.05

    • 'glorot_normal': normal distribution with mean 0 and stddev = sqrt(2 / (fan_in + fan_out))

    • 'glorot_uniform': uniform distribution over [-limit, limit], where limit = sqrt(6 / (fan_in + fan_out))

    • 'lecun_normal': normal distribution with mean 0 and stddev = sqrt(1 / fan_in)

    • 'lecun_uniform': uniform distribution over [-limit, limit], where limit = sqrt(3 / fan_in)

    • 'he_normal': normal distribution with mean 0 and stddev = sqrt(2 / fan_in)

    • 'he_uniform': uniform distribution over [-limit, limit], where limit = sqrt(6 / fan_in)

    Here fan_in is the number of input units and fan_out is the number of output units.

  • kernel_regularizer: the regularization method; typically tf.keras.regularizers.l2 is used.

  • name: a string used to name the layer.
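
As a quick sanity check on these formulas, the sketch below draws He-normal and Glorot-uniform weights with NumPy and verifies the claimed statistics. This is only an illustration of the formulas above, not Keras's actual initializer code.

```python
import numpy as np

fan_in, fan_out = 784, 512
rng = np.random.default_rng(0)

# 'he_normal': mean 0, stddev = sqrt(2 / fan_in)
he_std = np.sqrt(2.0 / fan_in)
w_he = rng.normal(0.0, he_std, size=(fan_in, fan_out))
print(w_he.std())   # close to sqrt(2 / 784) ≈ 0.0505

# 'glorot_uniform': uniform over [-limit, limit], limit = sqrt(6 / (fan_in + fan_out))
limit = np.sqrt(6.0 / (fan_in + fan_out))
w_glorot = rng.uniform(-limit, limit, size=(fan_in, fan_out))
print(w_glorot.min() >= -limit and w_glorot.max() <= limit)   # True
```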

Model configuration: tf.keras.Sequential.compile

Parameters used:

  • loss: the loss function; for classification with integer labels, "sparse_categorical_crossentropy" is typically used.

Optimizer: tf.keras.optimizers.SGD

Parameters used:

  • learning_rate: the learning rate; the default is 0.01.
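
The update rule behind this optimizer is simply w ← w − learning_rate · gradient. A minimal NumPy sketch with toy numbers (not Keras internals):

```python
import numpy as np

learning_rate = 0.001                  # the value used for this model
w = np.array([0.5, -0.3])              # toy weight vector
grad = np.array([0.2, -0.1])           # gradient of the loss w.r.t. w

w = w - learning_rate * grad           # one SGD step
print(w)                               # [ 0.4998 -0.2999]
```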

Build the network

  • Architecture: 40 hidden layers of 512 neurons each (20 plain layers followed by 20 with L2 regularization)
  • Activation: tanh
  • Initialization: lecun_normal
  • Loss: cross-entropy
  • Optimizer: SGD with learning rate 0.001
  • Metric: accuracy
model = Sequential()

# Flatten layer
model.add(Flatten(input_shape=(28, 28), name='flatten'))

# Hidden dense layers
for i in range(20):
    model.add(Dense(units=512, activation='tanh',
                    kernel_initializer='lecun_normal'))

# Hidden dense layers with L2 regularization
for i in range(20):
    model.add(Dense(units=512, activation='tanh', kernel_initializer='lecun_normal',
                    kernel_regularizer=tf.keras.regularizers.l2(1e-5)))

# Dropout layer (disabled here)
# model.add(Dropout(rate=0.5))

# Output layer
model.add(Dense(units=10, activation='softmax', name='logit'))

# Configure the loss, optimizer, and evaluation metrics
model.compile(loss='sparse_categorical_crossentropy', optimizer=tf.keras.optimizers.SGD(
    learning_rate=0.001), metrics=['accuracy'])

Check each layer's parameter count and output shape

model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
flatten (Flatten)            (None, 784)               0         
_________________________________________________________________
dense (Dense)                (None, 512)               401920    
_________________________________________________________________
dense_1 (Dense)              (None, 512)               262656    
_________________________________________________________________
dense_2 (Dense)              (None, 512)               262656    
_________________________________________________________________
dense_3 (Dense)              (None, 512)               262656    
_________________________________________________________________
dense_4 (Dense)              (None, 512)               262656    
_________________________________________________________________
dense_5 (Dense)              (None, 512)               262656    
_________________________________________________________________
dense_6 (Dense)              (None, 512)               262656    
_________________________________________________________________
dense_7 (Dense)              (None, 512)               262656    
_________________________________________________________________
dense_8 (Dense)              (None, 512)               262656    
_________________________________________________________________
dense_9 (Dense)              (None, 512)               262656    
_________________________________________________________________
dense_10 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_11 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_12 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_13 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_14 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_15 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_16 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_17 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_18 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_19 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_20 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_21 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_22 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_23 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_24 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_25 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_26 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_27 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_28 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_29 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_30 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_31 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_32 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_33 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_34 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_35 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_36 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_37 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_38 (Dense)             (None, 512)               262656    
_________________________________________________________________
dense_39 (Dense)             (None, 512)               262656    
_________________________________________________________________
logit (Dense)                (None, 10)                5130      
=================================================================
Total params: 10,650,634
Trainable params: 10,650,634
Non-trainable params: 0
_________________________________________________________________

Techniques for handling overfitting

Increase the amount of training data

  1. Collect more data.
  2. Data augmentation: apply a series of random transformations to existing images to generate similar but distinct training samples, enlarging the training set. Augmentation is not covered here; if you are interested, see the tf.keras.preprocessing.image.ImageDataGenerator API.
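
The idea can be sketched without Keras. The hand-rolled helper below (illustrative only; ImageDataGenerator provides shifts, rotations, flips, and more) randomly shifts a 28×28 image by up to two pixels, yielding a similar but distinct sample:

```python
import numpy as np

def random_shift(img, max_shift=2, rng=None):
    """Shift a 2-D image by up to max_shift pixels per axis,
    filling the exposed border with zeros (the MNIST background)."""
    rng = rng or np.random.default_rng()
    dy, dx = rng.integers(-max_shift, max_shift + 1, size=2)
    h, w = img.shape
    out = np.zeros_like(img)
    ys, yd = max(0, -dy), max(0, dy)   # source / destination row offsets
    xs, xd = max(0, -dx), max(0, dx)   # source / destination column offsets
    out[yd:h - ys, xd:w - xs] = img[ys:h - yd, xs:w - xd]
    return out

img = np.arange(784.0).reshape(28, 28)
aug = random_shift(img, rng=np.random.default_rng(7))
print(aug.shape)   # (28, 28)
```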

Reduce model complexity

  1. Use fewer hidden layers.
  2. Use fewer neurons per layer.

Add a regularization term

tf.keras.regularizers.l2

Parameters used:

  • l: the penalty coefficient; the default is 0.01.
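
What l2 actually does is add l · Σw² (summed over each regularized kernel) to the training loss. A NumPy illustration with toy numbers, not the model's actual weights:

```python
import numpy as np

l = 1e-5                                  # penalty coefficient, as used in the model above
w = np.array([[0.5, -1.0],
              [2.0,  0.0]])               # a toy kernel matrix

penalty = l * np.sum(w ** 2)              # the term added to the total loss
print(penalty)                            # ≈ 5.25e-05
```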

Early stopping

tf.keras.callbacks.EarlyStopping

Parameters used:

  • monitor: the quantity to monitor, usually 'val_loss'.
  • min_delta: the minimum change that counts as an improvement; only a change larger than min_delta is considered an improvement. Default 0.
  • patience: the number of epochs without improvement after which training stops. Default 0.
  • restore_best_weights: whether to restore the model weights from the epoch with the best monitored value; if False, the weights from the last training step are used. Default False.
# Set up EarlyStopping: stop if val_loss shows no improvement for 10 epochs
earlystop = EarlyStopping(monitor='val_loss', min_delta=1e-4,
                          patience=10, restore_best_weights=True)

Dropout

tf.keras.layers.Dropout

Parameters used:

  • rate: the probability that a neuron is dropped; a float between 0 and 1.
  • seed: the random seed, a positive integer.
  • name: a string used to name the layer.
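
Keras implements inverted dropout: during training each unit is zeroed with probability rate and the survivors are scaled by 1/(1 − rate), so the expected activation is unchanged; at inference time the layer is an identity. A NumPy sketch of the training-time behavior:

```python
import numpy as np

def dropout(x, rate=0.5, rng=None):
    """Inverted dropout: zero each unit with probability `rate`
    and scale survivors by 1 / (1 - rate). Training-time only."""
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= rate
    return x * mask / (1.0 - rate)

x = np.ones(10000)
y = dropout(x, rate=0.5, rng=np.random.default_rng(42))
print(y.mean())   # close to 1.0: the expectation is preserved
```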

Setting up TensorBoard

tf.keras.callbacks.TensorBoard

Parameters used:

  • log_dir: the directory where TensorBoard logs are saved; defaults to 'logs' under the current directory.
  • histogram_freq: how often (in epochs) to compute histograms of the weights and biases; the default of 0 means no histograms are computed or plotted.
  • write_graph: whether to write the computation graph; default True.
  • write_images: whether to write the model weights so they can be visualized as images in TensorBoard; default False.
  • update_freq: how often to write logs; one of 'batch', 'epoch', or a positive integer, default 'epoch'. 'batch' logs the loss and metrics after every batch, 'epoch' after every epoch, and an integer such as 1000 logs every 1000 batches. Note that logging too frequently slows training down.
# Create the log directory
model_name = "model-{}".format(int(time.time()))
logdir = os.path.join('logs', model_name)
if not os.path.exists(logdir):
    os.makedirs(logdir)

tensorboard = TensorBoard(log_dir=logdir, histogram_freq=1, write_graph=True, write_images=True, update_freq='epoch')

Model training

tf.keras.Sequential.fit

Parameters used:

  • x: the input data.
  • y: the labels.
  • batch_size: the number of samples used per gradient update.
  • epochs: the number of training epochs; one epoch is one full pass over the training set.
  • validation_split: the fraction of the data held out for validation, between 0 and 1.
  • shuffle: whether to shuffle the data before each epoch; default True.
  • callbacks: a list of callback functions invoked at specific stages of training; pass them to the model's .fit() to observe the network's internal state and statistics while it trains.

Returns: a History object; its History.history attribute records the training- and validation-set loss and metric values for every epoch.

# Start training
history = model.fit(x=x_train, y=y_train, batch_size=32,
                     epochs=200, validation_split=0.4,
                     shuffle=True, callbacks=[tensorboard])
Epoch 1/200
 2/19 [==>...........................] - ETA: 11s - loss: 2.4172 - accuracy: 0.0781WARNING:tensorflow:Method (on_train_batch_end) is slow compared to the batch update (0.704117). Check your callbacks.
19/19 [==============================] - 5s 285ms/step - loss: 2.3712 - accuracy: 0.1500 - val_loss: 2.3191 - val_accuracy: 0.1750
Epoch 2/200
19/19 [==============================] - 4s 184ms/step - loss: 2.2486 - accuracy: 0.3300 - val_loss: 2.1968 - val_accuracy: 0.4250
Epoch 3/200
19/19 [==============================] - 4s 190ms/step - loss: 2.1014 - accuracy: 0.4600 - val_loss: 2.0500 - val_accuracy: 0.4600
Epoch 4/200
19/19 [==============================] - 4s 192ms/step - loss: 1.9412 - accuracy: 0.5150 - val_loss: 1.9069 - val_accuracy: 0.4700
Epoch 5/200
19/19 [==============================] - 3s 183ms/step - loss: 1.7935 - accuracy: 0.5383 - val_loss: 1.7905 - val_accuracy: 0.4875
Epoch 6/200
19/19 [==============================] - 4s 187ms/step - loss: 1.6675 - accuracy: 0.5633 - val_loss: 1.6802 - val_accuracy: 0.5050
Epoch 7/200
19/19 [==============================] - 4s 189ms/step - loss: 1.5546 - accuracy: 0.6017 - val_loss: 1.5876 - val_accuracy: 0.5250
Epoch 8/200
19/19 [==============================] - 4s 190ms/step - loss: 1.4534 - accuracy: 0.6233 - val_loss: 1.5052 - val_accuracy: 0.5575
Epoch 9/200
19/19 [==============================] - 4s 188ms/step - loss: 1.3624 - accuracy: 0.6517 - val_loss: 1.4285 - val_accuracy: 0.5875
Epoch 10/200
19/19 [==============================] - 4s 189ms/step - loss: 1.2894 - accuracy: 0.6733 - val_loss: 1.3671 - val_accuracy: 0.6100
Epoch 11/200
19/19 [==============================] - 4s 188ms/step - loss: 1.2197 - accuracy: 0.7050 - val_loss: 1.3083 - val_accuracy: 0.6400
Epoch 12/200
19/19 [==============================] - 4s 187ms/step - loss: 1.1535 - accuracy: 0.7400 - val_loss: 1.2598 - val_accuracy: 0.6525
Epoch 13/200
19/19 [==============================] - 3s 180ms/step - loss: 1.0948 - accuracy: 0.7533 - val_loss: 1.2115 - val_accuracy: 0.6825
Epoch 14/200
19/19 [==============================] - 4s 194ms/step - loss: 1.0389 - accuracy: 0.7800 - val_loss: 1.1674 - val_accuracy: 0.7025
Epoch 15/200
19/19 [==============================] - 3s 182ms/step - loss: 0.9875 - accuracy: 0.7917 - val_loss: 1.1367 - val_accuracy: 0.6900
Epoch 16/200
19/19 [==============================] - 4s 189ms/step - loss: 0.9337 - accuracy: 0.8067 - val_loss: 1.1019 - val_accuracy: 0.7050
Epoch 17/200
19/19 [==============================] - 4s 197ms/step - loss: 0.8906 - accuracy: 0.8200 - val_loss: 1.0556 - val_accuracy: 0.7250
Epoch 18/200
19/19 [==============================] - 4s 188ms/step - loss: 0.8467 - accuracy: 0.8300 - val_loss: 1.0149 - val_accuracy: 0.7450
Epoch 19/200
19/19 [==============================] - 4s 197ms/step - loss: 0.8050 - accuracy: 0.8483 - val_loss: 0.9865 - val_accuracy: 0.7525
Epoch 20/200
19/19 [==============================] - 3s 180ms/step - loss: 0.7667 - accuracy: 0.8700 - val_loss: 0.9653 - val_accuracy: 0.7575
Epoch 21/200
19/19 [==============================] - 4s 188ms/step - loss: 0.7319 - accuracy: 0.8783 - val_loss: 0.9364 - val_accuracy: 0.7725
Epoch 22/200
19/19 [==============================] - 4s 188ms/step - loss: 0.6989 - accuracy: 0.8833 - val_loss: 0.9158 - val_accuracy: 0.7875
Epoch 23/200
19/19 [==============================] - 4s 187ms/step - loss: 0.6696 - accuracy: 0.8917 - val_loss: 0.8995 - val_accuracy: 0.7875
Epoch 24/200
19/19 [==============================] - 3s 182ms/step - loss: 0.6449 - accuracy: 0.8983 - val_loss: 0.8799 - val_accuracy: 0.7825
Epoch 25/200
19/19 [==============================] - 4s 185ms/step - loss: 0.6143 - accuracy: 0.9200 - val_loss: 0.8642 - val_accuracy: 0.7825
Epoch 26/200
19/19 [==============================] - 4s 194ms/step - loss: 0.5879 - accuracy: 0.9150 - val_loss: 0.8607 - val_accuracy: 0.7925
Epoch 27/200
19/19 [==============================] - 4s 216ms/step - loss: 0.5699 - accuracy: 0.9183 - val_loss: 0.8380 - val_accuracy: 0.8100
Epoch 28/200
19/19 [==============================] - 5s 237ms/step - loss: 0.5476 - accuracy: 0.9250 - val_loss: 0.8264 - val_accuracy: 0.7975
Epoch 29/200
19/19 [==============================] - 4s 234ms/step - loss: 0.5273 - accuracy: 0.9317 - val_loss: 0.8240 - val_accuracy: 0.8050
Epoch 30/200
19/19 [==============================] - 5s 238ms/step - loss: 0.5063 - accuracy: 0.9350 - val_loss: 0.8178 - val_accuracy: 0.8100
Epoch 31/200
19/19 [==============================] - 4s 235ms/step - loss: 0.4863 - accuracy: 0.9433 - val_loss: 0.8144 - val_accuracy: 0.8000
Epoch 32/200
19/19 [==============================] - 4s 222ms/step - loss: 0.4730 - accuracy: 0.9433 - val_loss: 0.8116 - val_accuracy: 0.8225
Epoch 33/200
19/19 [==============================] - 4s 232ms/step - loss: 0.4546 - accuracy: 0.9533 - val_loss: 0.8066 - val_accuracy: 0.7950
Epoch 34/200
19/19 [==============================] - 4s 222ms/step - loss: 0.4411 - accuracy: 0.9500 - val_loss: 0.7922 - val_accuracy: 0.8125
Epoch 35/200
19/19 [==============================] - 4s 229ms/step - loss: 0.4260 - accuracy: 0.9567 - val_loss: 0.7828 - val_accuracy: 0.8150
Epoch 36/200
19/19 [==============================] - 5s 238ms/step - loss: 0.4102 - accuracy: 0.9550 - val_loss: 0.7931 - val_accuracy: 0.8150
Epoch 37/200
19/19 [==============================] - 4s 186ms/step - loss: 0.4000 - accuracy: 0.9567 - val_loss: 0.7897 - val_accuracy: 0.8050
Epoch 38/200
19/19 [==============================] - 4s 190ms/step - loss: 0.3893 - accuracy: 0.9650 - val_loss: 0.7867 - val_accuracy: 0.8025
Epoch 39/200
19/19 [==============================] - 3s 180ms/step - loss: 0.3740 - accuracy: 0.9667 - val_loss: 0.7854 - val_accuracy: 0.8000
Epoch 40/200
19/19 [==============================] - 4s 185ms/step - loss: 0.3659 - accuracy: 0.9683 - val_loss: 0.7773 - val_accuracy: 0.8100
Epoch 41/200
19/19 [==============================] - 4s 193ms/step - loss: 0.3557 - accuracy: 0.9717 - val_loss: 0.7713 - val_accuracy: 0.7975
Epoch 42/200
19/19 [==============================] - 4s 196ms/step - loss: 0.3471 - accuracy: 0.9717 - val_loss: 0.7718 - val_accuracy: 0.8050
Epoch 43/200
19/19 [==============================] - 4s 188ms/step - loss: 0.3352 - accuracy: 0.9767 - val_loss: 0.7745 - val_accuracy: 0.8100
Epoch 44/200
19/19 [==============================] - 3s 181ms/step - loss: 0.3277 - accuracy: 0.9800 - val_loss: 0.7695 - val_accuracy: 0.8100
Epoch 45/200
19/19 [==============================] - 4s 191ms/step - loss: 0.3181 - accuracy: 0.9733 - val_loss: 0.7641 - val_accuracy: 0.8075
Epoch 46/200
19/19 [==============================] - 4s 193ms/step - loss: 0.3084 - accuracy: 0.9900 - val_loss: 0.7651 - val_accuracy: 0.8125
Epoch 47/200
19/19 [==============================] - 4s 188ms/step - loss: 0.3010 - accuracy: 0.9900 - val_loss: 0.7661 - val_accuracy: 0.8075
Epoch 48/200
19/19 [==============================] - 4s 195ms/step - loss: 0.2928 - accuracy: 0.9883 - val_loss: 0.7723 - val_accuracy: 0.8000
Epoch 49/200
19/19 [==============================] - 4s 187ms/step - loss: 0.2864 - accuracy: 0.9900 - val_loss: 0.7715 - val_accuracy: 0.8075
Epoch 50/200
19/19 [==============================] - 4s 205ms/step - loss: 0.2798 - accuracy: 0.9917 - val_loss: 0.7827 - val_accuracy: 0.7900
Epoch 51/200
19/19 [==============================] - 3s 181ms/step - loss: 0.2742 - accuracy: 0.9933 - val_loss: 0.7697 - val_accuracy: 0.8050
Epoch 52/200
19/19 [==============================] - 4s 189ms/step - loss: 0.2658 - accuracy: 0.9933 - val_loss: 0.7735 - val_accuracy: 0.8050
Epoch 53/200
19/19 [==============================] - 4s 187ms/step - loss: 0.2609 - accuracy: 0.9933 - val_loss: 0.7763 - val_accuracy: 0.7975
Epoch 54/200
19/19 [==============================] - 4s 191ms/step - loss: 0.2566 - accuracy: 0.9917 - val_loss: 0.7694 - val_accuracy: 0.8075
Epoch 55/200
19/19 [==============================] - 4s 196ms/step - loss: 0.2490 - accuracy: 0.9933 - val_loss: 0.7626 - val_accuracy: 0.8100
Epoch 56/200
19/19 [==============================] - 3s 184ms/step - loss: 0.2459 - accuracy: 0.9933 - val_loss: 0.7731 - val_accuracy: 0.8075
Epoch 57/200
19/19 [==============================] - 4s 191ms/step - loss: 0.2394 - accuracy: 0.9967 - val_loss: 0.7745 - val_accuracy: 0.8000
Epoch 58/200
19/19 [==============================] - 4s 185ms/step - loss: 0.2341 - accuracy: 0.9983 - val_loss: 0.7792 - val_accuracy: 0.7975
Epoch 59/200
19/19 [==============================] - 3s 181ms/step - loss: 0.2302 - accuracy: 0.9950 - val_loss: 0.7717 - val_accuracy: 0.7950
Epoch 60/200
19/19 [==============================] - 3s 183ms/step - loss: 0.2263 - accuracy: 0.9983 - val_loss: 0.7832 - val_accuracy: 0.7975
Epoch 61/200
19/19 [==============================] - 3s 181ms/step - loss: 0.2218 - accuracy: 0.9967 - val_loss: 0.7798 - val_accuracy: 0.8025
Epoch 62/200
19/19 [==============================] - 3s 180ms/step - loss: 0.2182 - accuracy: 0.9983 - val_loss: 0.7756 - val_accuracy: 0.8025
Epoch 63/200
19/19 [==============================] - 3s 181ms/step - loss: 0.2140 - accuracy: 0.9983 - val_loss: 0.7765 - val_accuracy: 0.8000
Epoch 64/200
19/19 [==============================] - 4s 197ms/step - loss: 0.2114 - accuracy: 0.9983 - val_loss: 0.7844 - val_accuracy: 0.7975
Epoch 65/200
19/19 [==============================] - 3s 184ms/step - loss: 0.2087 - accuracy: 0.9983 - val_loss: 0.7819 - val_accuracy: 0.8075
Epoch 66/200
19/19 [==============================] - 4s 186ms/step - loss: 0.2048 - accuracy: 0.9983 - val_loss: 0.7815 - val_accuracy: 0.8000
Epoch 67/200
19/19 [==============================] - 3s 174ms/step - loss: 0.2019 - accuracy: 0.9983 - val_loss: 0.7838 - val_accuracy: 0.7950
Epoch 68/200
19/19 [==============================] - 3s 179ms/step - loss: 0.1987 - accuracy: 1.0000 - val_loss: 0.7768 - val_accuracy: 0.7950
Epoch 69/200
19/19 [==============================] - 3s 181ms/step - loss: 0.1960 - accuracy: 1.0000 - val_loss: 0.7880 - val_accuracy: 0.7950
Epoch 70/200
19/19 [==============================] - 4s 191ms/step - loss: 0.1941 - accuracy: 1.0000 - val_loss: 0.7923 - val_accuracy: 0.7975
Epoch 71/200
19/19 [==============================] - 4s 186ms/step - loss: 0.1907 - accuracy: 1.0000 - val_loss: 0.7883 - val_accuracy: 0.7975
Epoch 72/200
19/19 [==============================] - 3s 180ms/step - loss: 0.1876 - accuracy: 0.9983 - val_loss: 0.7864 - val_accuracy: 0.7975
Epoch 73/200
19/19 [==============================] - 3s 181ms/step - loss: 0.1856 - accuracy: 1.0000 - val_loss: 0.7858 - val_accuracy: 0.7950
Epoch 74/200
19/19 [==============================] - 3s 183ms/step - loss: 0.1825 - accuracy: 1.0000 - val_loss: 0.7901 - val_accuracy: 0.8025
Epoch 75/200
19/19 [==============================] - 4s 189ms/step - loss: 0.1813 - accuracy: 1.0000 - val_loss: 0.7878 - val_accuracy: 0.7925
Epoch 76/200
19/19 [==============================] - 3s 178ms/step - loss: 0.1792 - accuracy: 1.0000 - val_loss: 0.7900 - val_accuracy: 0.7925
Epoch 77/200
19/19 [==============================] - 4s 194ms/step - loss: 0.1768 - accuracy: 1.0000 - val_loss: 0.7964 - val_accuracy: 0.7775
Epoch 78/200
19/19 [==============================] - 3s 181ms/step - loss: 0.1754 - accuracy: 1.0000 - val_loss: 0.7921 - val_accuracy: 0.7950
Epoch 79/200
19/19 [==============================] - 3s 183ms/step - loss: 0.1732 - accuracy: 1.0000 - val_loss: 0.7962 - val_accuracy: 0.7875
Epoch 80/200
19/19 [==============================] - 4s 189ms/step - loss: 0.1712 - accuracy: 1.0000 - val_loss: 0.7909 - val_accuracy: 0.7925
Epoch 81/200
19/19 [==============================] - 4s 184ms/step - loss: 0.1692 - accuracy: 1.0000 - val_loss: 0.7955 - val_accuracy: 0.7875
Epoch 82/200
19/19 [==============================] - 3s 181ms/step - loss: 0.1680 - accuracy: 1.0000 - val_loss: 0.7954 - val_accuracy: 0.7975
Epoch 83/200
19/19 [==============================] - 3s 183ms/step - loss: 0.1661 - accuracy: 1.0000 - val_loss: 0.8050 - val_accuracy: 0.7775
Epoch 84/200
19/19 [==============================] - 4s 190ms/step - loss: 0.1649 - accuracy: 1.0000 - val_loss: 0.8020 - val_accuracy: 0.7850
Epoch 85/200
19/19 [==============================] - 3s 184ms/step - loss: 0.1636 - accuracy: 1.0000 - val_loss: 0.8026 - val_accuracy: 0.7875
Epoch 86/200
19/19 [==============================] - 3s 181ms/step - loss: 0.1624 - accuracy: 1.0000 - val_loss: 0.8027 - val_accuracy: 0.7875
Epoch 87/200
19/19 [==============================] - 4s 187ms/step - loss: 0.1603 - accuracy: 1.0000 - val_loss: 0.8082 - val_accuracy: 0.7775
Epoch 88/200
19/19 [==============================] - 3s 181ms/step - loss: 0.1589 - accuracy: 1.0000 - val_loss: 0.8035 - val_accuracy: 0.7900
Epoch 89/200
19/19 [==============================] - 3s 181ms/step - loss: 0.1578 - accuracy: 1.0000 - val_loss: 0.8086 - val_accuracy: 0.7850
Epoch 90/200
19/19 [==============================] - 3s 176ms/step - loss: 0.1568 - accuracy: 1.0000 - val_loss: 0.8114 - val_accuracy: 0.7725
Epoch 91/200
19/19 [==============================] - 3s 179ms/step - loss: 0.1554 - accuracy: 1.0000 - val_loss: 0.8096 - val_accuracy: 0.7850
Epoch 92/200
19/19 [==============================] - 3s 179ms/step - loss: 0.1542 - accuracy: 1.0000 - val_loss: 0.8129 - val_accuracy: 0.7850
Epoch 93/200
19/19 [==============================] - 4s 185ms/step - loss: 0.1534 - accuracy: 1.0000 - val_loss: 0.8090 - val_accuracy: 0.7875
Epoch 94/200
19/19 [==============================] - 3s 183ms/step - loss: 0.1522 - accuracy: 1.0000 - val_loss: 0.8057 - val_accuracy: 0.7900
Epoch 95/200
19/19 [==============================] - 3s 184ms/step - loss: 0.1513 - accuracy: 1.0000 - val_loss: 0.8085 - val_accuracy: 0.7925
Epoch 96/200
19/19 [==============================] - 4s 193ms/step - loss: 0.1499 - accuracy: 1.0000 - val_loss: 0.8110 - val_accuracy: 0.7875
Epoch 97/200
19/19 [==============================] - 3s 183ms/step - loss: 0.1490 - accuracy: 1.0000 - val_loss: 0.8170 - val_accuracy: 0.7850
Epoch 98/200
19/19 [==============================] - 3s 176ms/step - loss: 0.1481 - accuracy: 1.0000 - val_loss: 0.8100 - val_accuracy: 0.7875
Epoch 99/200
19/19 [==============================] - 3s 183ms/step - loss: 0.1471 - accuracy: 1.0000 - val_loss: 0.8174 - val_accuracy: 0.7875
Epoch 100/200
19/19 [==============================] - 4s 187ms/step - loss: 0.1465 - accuracy: 1.0000 - val_loss: 0.8178 - val_accuracy: 0.7725
Epoch 101/200
19/19 [==============================] - 4s 188ms/step - loss: 0.1457 - accuracy: 1.0000 - val_loss: 0.8147 - val_accuracy: 0.7900
Epoch 102/200
19/19 [==============================] - 3s 173ms/step - loss: 0.1447 - accuracy: 1.0000 - val_loss: 0.8154 - val_accuracy: 0.7925
Epoch 103/200
19/19 [==============================] - 3s 184ms/step - loss: 0.1437 - accuracy: 1.0000 - val_loss: 0.8284 - val_accuracy: 0.7775
Epoch 104/200
19/19 [==============================] - 3s 173ms/step - loss: 0.1434 - accuracy: 1.0000 - val_loss: 0.8199 - val_accuracy: 0.7900
Epoch 105/200
19/19 [==============================] - 3s 179ms/step - loss: 0.1424 - accuracy: 1.0000 - val_loss: 0.8194 - val_accuracy: 0.7825
Epoch 106/200
19/19 [==============================] - 3s 178ms/step - loss: 0.1415 - accuracy: 1.0000 - val_loss: 0.8200 - val_accuracy: 0.7950
Epoch 107/200
19/19 [==============================] - 3s 183ms/step - loss: 0.1410 - accuracy: 1.0000 - val_loss: 0.8213 - val_accuracy: 0.7875
Epoch 108/200
19/19 [==============================] - 3s 183ms/step - loss: 0.1405 - accuracy: 1.0000 - val_loss: 0.8229 - val_accuracy: 0.7850
Epoch 109/200
19/19 [==============================] - 3s 179ms/step - loss: 0.1395 - accuracy: 1.0000 - val_loss: 0.8242 - val_accuracy: 0.7950
Epoch 110/200
19/19 [==============================] - 3s 178ms/step - loss: 0.1392 - accuracy: 1.0000 - val_loss: 0.8255 - val_accuracy: 0.7900
Epoch 111/200
19/19 [==============================] - 3s 175ms/step - loss: 0.1384 - accuracy: 1.0000 - val_loss: 0.8224 - val_accuracy: 0.7925
Epoch 112/200
19/19 [==============================] - 4s 190ms/step - loss: 0.1376 - accuracy: 1.0000 - val_loss: 0.8268 - val_accuracy: 0.7825
Epoch 113/200
19/19 [==============================] - 3s 180ms/step - loss: 0.1370 - accuracy: 1.0000 - val_loss: 0.8262 - val_accuracy: 0.7900
Epoch 114/200
19/19 [==============================] - 3s 179ms/step - loss: 0.1366 - accuracy: 1.0000 - val_loss: 0.8267 - val_accuracy: 0.7875
Epoch 115/200
19/19 [==============================] - 3s 178ms/step - loss: 0.1360 - accuracy: 1.0000 - val_loss: 0.8247 - val_accuracy: 0.7875
Epoch 116/200
19/19 [==============================] - 4s 197ms/step - loss: 0.1355 - accuracy: 1.0000 - val_loss: 0.8289 - val_accuracy: 0.7850
Epoch 117/200
19/19 [==============================] - 3s 184ms/step - loss: 0.1349 - accuracy: 1.0000 - val_loss: 0.8348 - val_accuracy: 0.7800
Epoch 118/200
19/19 [==============================] - 4s 208ms/step - loss: 0.1344 - accuracy: 1.0000 - val_loss: 0.8326 - val_accuracy: 0.7850
Epoch 119/200
19/19 [==============================] - 4s 185ms/step - loss: 0.1337 - accuracy: 1.0000 - val_loss: 0.8308 - val_accuracy: 0.7875
Epoch 120/200
19/19 [==============================] - 3s 184ms/step - loss: 0.1333 - accuracy: 1.0000 - val_loss: 0.8290 - val_accuracy: 0.7900
Epoch 121/200
19/19 [==============================] - 3s 182ms/step - loss: 0.1331 - accuracy: 1.0000 - val_loss: 0.8291 - val_accuracy: 0.7950
Epoch 122/200
19/19 [==============================] - 3s 179ms/step - loss: 0.1322 - accuracy: 1.0000 - val_loss: 0.8297 - val_accuracy: 0.7950
Epoch 123/200
19/19 [==============================] - 3s 179ms/step - loss: 0.1318 - accuracy: 1.0000 - val_loss: 0.8348 - val_accuracy: 0.7850
Epoch 124/200
19/19 [==============================] - 4s 187ms/step - loss: 0.1315 - accuracy: 1.0000 - val_loss: 0.8318 - val_accuracy: 0.7925
Epoch 125/200
19/19 [==============================] - 4s 185ms/step - loss: 0.1311 - accuracy: 1.0000 - val_loss: 0.8383 - val_accuracy: 0.7800
Epoch 126/200
19/19 [==============================] - 4s 198ms/step - loss: 0.1306 - accuracy: 1.0000 - val_loss: 0.8346 - val_accuracy: 0.7900
...(epochs 127-198 omitted: training loss keeps falling while val_loss keeps rising)...
Epoch 199/200
19/19 [==============================] - 3s 173ms/step - loss: 0.1148 - accuracy: 1.0000 - val_loss: 0.8801 - val_accuracy: 0.7825
Epoch 200/200
19/19 [==============================] - 3s 174ms/step - loss: 0.1146 - accuracy: 1.0000 - val_loss: 0.8820 - val_accuracy: 0.7775

Plot the training history to inspect the trends

pd.DataFrame(history.history).plot(figsize=(8, 5))
plt.grid(True)
plt.xlabel('epoch')
plt.show()
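The default `DataFrame.plot` call above puts loss and accuracy on one axis even though their scales differ; splitting them into two panels is often clearer. A minimal sketch of that variant (the `history_dict` values below are hypothetical stand-ins for `history.history`, and the figure is saved rather than shown so the sketch runs headless):

```python
import pandas as pd
import matplotlib
matplotlib.use('Agg')  # render off-screen, no display needed
import matplotlib.pyplot as plt

# hypothetical stand-in for history.history returned by model.fit
history_dict = {
    'loss':         [0.90, 0.50, 0.30, 0.20],
    'val_loss':     [0.95, 0.70, 0.65, 0.70],
    'accuracy':     [0.70, 0.85, 0.93, 0.97],
    'val_accuracy': [0.72, 0.80, 0.81, 0.80],
}
hist = pd.DataFrame(history_dict)

# one panel for losses, one for accuracies
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(12, 4))
hist[['loss', 'val_loss']].plot(ax=ax1, grid=True, title='loss')
hist[['accuracy', 'val_accuracy']].plot(ax=ax2, grid=True, title='accuracy')
ax1.set_xlabel('epoch')
ax2.set_xlabel('epoch')
plt.savefig('history_curves.png')
```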

(figure: training and validation loss/accuracy curves over 200 epochs)

As the number of epochs increases:

  • loss (blue) falls steadily;
  • val_loss (green) first falls, then climbs back up;
  • accuracy (yellow) keeps rising until it reaches 1.0;
  • val_accuracy (red) rises at first, then plateaus around 0.78.

The diverging trends of loss (blue) and val_loss (green), with training loss still falling while validation loss rises, are the signature of overfitting.

Evaluating on the test set

loss, accuracy = model.evaluate(x_test, y_test)
313/313 [==============================] - 4s 12ms/step - loss: 0.7756 - accuracy: 0.8212
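Overall accuracy hides which digits the model confuses with which. A per-class breakdown can be computed from the predictions with a confusion matrix; a small sketch (here `y_true` and `y_pred` are tiny hypothetical arrays standing in for `y_test` and `model.predict(x_test).argmax(axis=1)`):

```python
import numpy as np

y_true = np.array([0, 1, 2, 2, 1, 0, 2])  # hypothetical true labels
y_pred = np.array([0, 1, 2, 1, 1, 0, 2])  # hypothetical predictions

num_classes = 3
# confusion[i, j] counts samples of true class i predicted as class j
confusion = np.zeros((num_classes, num_classes), dtype=int)
for t, p in zip(y_true, y_pred):
    confusion[t, p] += 1

# diagonal = correct predictions; row sum = samples per true class
per_class_acc = confusion.diagonal() / confusion.sum(axis=1)
print(confusion)
print(per_class_acc)
```

For the real Mnist model one would set `num_classes = 10`; a low per-class accuracy then points at the digits the overfit model generalizes worst on.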

Test-accuracy summary

  • Baseline: 0.8425
  • More training data (10,000 images): 0.9301
  • Reduced model complexity (single hidden layer, 100 neurons): 0.8511
  • Added regularization (penalty coefficient 1e-5): 0.8409
  • Early stopping: 0.8430
  • Dropout (rate=0.5 on the last two hidden layers): 0.8481
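The remedies above were each applied in isolation; in practice they can be combined. A hedged sketch of one such model, mixing an L2 penalty, Dropout, and early stopping (the layer sizes and hyperparameters here are illustrative, not the exact values used in the experiments above):

```python
import tensorflow as tf
from tensorflow.keras import regularizers
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Flatten, Dense, Dropout
from tensorflow.keras.callbacks import EarlyStopping

model = Sequential([
    Flatten(input_shape=(28, 28)),
    Dense(128, activation='relu',
          kernel_regularizer=regularizers.l2(1e-5)),  # L2 weight penalty
    Dropout(0.5),                                     # drop half the units while training
    Dense(10, activation='softmax'),
])
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])

# stop when val_loss has not improved for 10 epochs,
# and roll back to the best weights seen so far
early_stop = EarlyStopping(monitor='val_loss', patience=10,
                           restore_best_weights=True)
# model.fit(x_train, y_train, epochs=200,
#           validation_split=0.2, callbacks=[early_stop])
```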
