Keras Learning 4: Callbacks

First, here is the official Keras documentation's explanation of callbacks: Keras-Callback

A callback is a set of functions that are called at specific stages of training. You can use callbacks to observe the internal state and statistics of the network during training. By passing a list of callbacks to the model's .fit() method, the functions in the list will be invoked at the corresponding stages of training.
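As a minimal sketch of this wiring (model, x_train, y_train, x_val and y_val are placeholders assumed to exist, and the filename and patience value are arbitrary):

# pass a list of callback instances to .fit(); Keras invokes them during training
from keras.callbacks import ModelCheckpoint, EarlyStopping

callbacks = [
    ModelCheckpoint('best.h5', monitor='val_loss', save_best_only=True),
    EarlyStopping(monitor='val_loss', patience=10),
]
model.fit(x_train, y_train, epochs=100,
          validation_data=(x_val, y_val), callbacks=callbacks)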

Two commonly used callbacks are ModelCheckpoint and EarlyStopping.

ModelCheckpoint

This callback saves the model to filepath after each epoch, depending on how training is going.

def __init__(self, filepath, monitor='val_loss', verbose=0,
                 save_best_only=False, save_weights_only=False,
                 mode='auto', period=1):

filepath: can be a formatted string; its placeholders will be filled with the epoch number and the entries of the logs dict passed to on_epoch_end.

For example: if filepath is weights.{epoch:02d}-{val_loss:.2f}.hdf5, then the model checkpoints will be saved with the epoch number and the validation loss in the filename.

For example: filepath='./model-{epoch:02d}-{val_acc:.2f}.h5'.
If the current epoch is 2 and val_acc is 24%, the resulting filename is model-02-0.24.h5.
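You can reproduce this naming with plain Python string formatting (2 and 0.24 are just the illustrative values from above):

# how the filepath placeholders get filled (illustrative values)
filepath = './model-{epoch:02d}-{val_acc:.2f}.h5'
print(filepath.format(epoch=2, val_acc=0.24))  # ./model-02-0.24.h5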

monitor: the quantity to monitor, usually 'val_acc', 'val_loss', etc.

verbose: verbosity mode, 0 or 1.

save_best_only: when set to True, only the model that performs best on the validation set is saved.

save_best_only: if save_best_only=True, the latest best model according to the quantity monitored will not be overwritten.

mode: one of 'auto', 'min', 'max'. When save_best_only=True, it determines the criterion for judging the best model; for example, when the monitored quantity is val_acc the mode should be max, and when it is val_loss the mode should be min. In auto mode, the direction is inferred automatically from the name of the monitored quantity.

save_weights_only: if set to True, only the model weights are saved; otherwise the whole model is saved (including the architecture, configuration, etc.).

period: the number of epochs between checkpoints, i.e. the monitored quantity is checked every period epochs to decide whether to save the current model.
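Putting these arguments together, here is a minimal sketch of a checkpoint that keeps only the best weights by validation loss (the filepath and checking interval are just example choices):

from keras.callbacks import ModelCheckpoint

# keep only the weights of the best model so far (lowest val_loss), checked every 2 epochs
checkpoint = ModelCheckpoint(filepath='./weights-{epoch:02d}-{val_loss:.2f}.h5',
                             monitor='val_loss', mode='min',
                             save_best_only=True, save_weights_only=True,
                             verbose=1, period=2)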

EarlyStopping

This callback stops training when the monitored quantity stops improving, i.e. the early-stopping strategy in ML (one of the tricks against overfitting, and a frequent interview question~~).

def __init__(self, monitor='val_loss',
                 min_delta=0, patience=0, verbose=0, mode='auto')

monitor: the quantity to monitor, e.g. 'val_acc', 'val_loss'.

min_delta: the minimum change that counts as an improvement in model performance; a change smaller than this is treated as no improvement.

minimum change in the monitored quantity to qualify as an improvement, i.e. an absolute change of less than min_delta, will count as no improvement.

patience: once the monitored quantity stops improving (e.g. the loss has not decreased compared with the previous epoch), training will be stopped after patience more epochs.

number of epochs with no improvement after which training will be stopped.

verbose: verbosity mode.

mode: one of 'auto', 'min', 'max'. In min mode, training stops when the monitored quantity stops decreasing; in max mode, training stops when it stops increasing.
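A minimal sketch combining these arguments (the monitored quantity and thresholds are just example choices):

from keras.callbacks import EarlyStopping

# stop training if val_loss has not improved by at least 1e-3 for 10 consecutive epochs
early_stop = EarlyStopping(monitor='val_loss', min_delta=1e-3,
                           patience=10, verbose=1, mode='min')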

A small example

# imports (the standalone Keras 2.x API is assumed, matching the lr/decay/period arguments used below)
from keras.layers import Input, Conv1D, Flatten, Activation
from keras.models import Model, load_model
from keras.optimizers import Adam
from keras.callbacks import ModelCheckpoint, EarlyStopping

# build network
# input_dim, num_classes, x_train, y_train, x_val, y_val, max_epoch and batch_size
# are assumed to be defined elsewhere
inputs = Input(shape=(input_dim, 1))
x = inputs
x = Conv1D(1, int(input_dim/num_classes), dilation_rate=num_classes, padding='valid', use_bias=False, name='conv1', kernel_initializer='random_uniform')(x)
x = Flatten()(x)
x = Activation('softmax')(x)
model = Model(inputs, x)

# compile model
adam = Adam(lr=1e-1, decay=1e-5)
model.compile(optimizer=adam, loss='categorical_crossentropy', metrics=['accuracy'])

# define callbacks
save_best = ModelCheckpoint(filepath='./model/ep{epoch:d}-acc{val_acc:f}.h5', monitor='val_acc', mode='max', verbose=1, save_best_only=True, period=1)
early_stop = EarlyStopping(monitor='val_acc', patience=50, verbose=2, mode='max')

# train with both callbacks
model.fit(x_train, y_train, epochs=max_epoch, batch_size=batch_size, shuffle=True, verbose=2, validation_data=(x_val, y_val), callbacks=[save_best, early_stop])

# evaluate
val_result = model.evaluate(x_val, y_val, batch_size=batch_size, verbose=0)
print("val_result = {}".format(val_result))

# inspect the learned weights of the conv1 layer
weight = model.get_layer('conv1').get_weights()
print(weight)
# best_model = load_model(best_model_path)
# best_result = best_model.evaluate(x_val, y_val, batch_size=batch_size, verbose=0)
# print("best_result = {}".format(best_result))
