Training with Keras
fit(
    self,
    x,
    y,
    batch_size=32,
    nb_epoch=10,
    verbose=1,
    callbacks=[],
    validation_split=0.0,
    validation_data=None,
    shuffle=True,
    class_weight=None,
    sample_weight=None
)
Saving the model architecture, the trained weights, and the optimizer state
The callbacks argument in Keras lets us hook into the training loop at the right moments, for example to save the model and its training state in real time as training proceeds.
keras.callbacks.ModelCheckpoint(
    filepath,
    monitor='val_loss',
    verbose=0,
    save_best_only=False,
    save_weights_only=False,
    mode='auto',
    period=1
)
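With save_best_only=True, a checkpoint is written only when the monitored value improves. A simplified, framework-free sketch of that decision logic (the CheckpointTracker name is ours, not a Keras API):

```python
import math

class CheckpointTracker:
    """Mimics ModelCheckpoint's save_best_only decision for a 'min' metric."""
    def __init__(self):
        self.best = math.inf
        self.saved_epochs = []

    def on_epoch_end(self, epoch, val_loss):
        if val_loss < self.best:          # improvement -> "save" the model
            self.best = val_loss
            self.saved_epochs.append(epoch)

tracker = CheckpointTracker()
for epoch, loss in enumerate([0.9, 0.7, 0.8, 0.6]):
    tracker.on_epoch_end(epoch, loss)
print(tracker.saved_epochs)  # epochs 0, 1 and 3 each improved on the best loss
```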
Using the EarlyStopping callback
from keras.callbacks import EarlyStopping
keras.callbacks.EarlyStopping(
    monitor='val_loss',
    patience=0,
    verbose=0,
    mode='auto'
)
early_stopping = EarlyStopping(monitor='val_loss', patience=0)
model.fit(X, y, validation_split=0.2, callbacks=[early_stopping])
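The patience logic can be illustrated without Keras. This standalone sketch (the function name is ours) replays a sequence of validation losses and reports the epoch at which training would stop:

```python
import math

def early_stop_epoch(val_losses, patience=0):
    """Return the epoch index where training stops, or None if it never does."""
    best = math.inf
    wait = 0
    for epoch, loss in enumerate(val_losses):
        if loss < best:
            best = loss
            wait = 0
        else:
            wait += 1
            if wait > patience:   # no improvement for more than `patience` epochs
                return epoch
    return None

print(early_stop_epoch([0.9, 0.8, 0.85, 0.84, 0.83], patience=1))
```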
keras.callbacks.LearningRateScheduler(schedule)
schedule: a function that takes the epoch number (an integer counted from 0) and returns a new learning rate (a float).
You can also let Keras adjust the learning rate automatically:
keras.callbacks.ReduceLROnPlateau(
    monitor='val_loss',
    factor=0.1,
    patience=10,
    verbose=0,
    mode='auto',
    epsilon=0.0001,
    cooldown=0,
    min_lr=0
)
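In outline, ReduceLROnPlateau multiplies the learning rate by factor once the monitored value has stopped improving for patience epochs. A simplified, framework-free sketch of that logic (the helper name is ours; the real callback also handles cooldown and the epsilon threshold):

```python
import math

def plateau_schedule(val_losses, lr=0.01, factor=0.1, patience=2, min_lr=1e-6):
    """Replay a sequence of validation losses and return the final learning rate."""
    best = math.inf
    wait = 0
    for loss in val_losses:
        if loss < best:
            best = loss
            wait = 0
        else:
            wait += 1
            if wait > patience:          # plateau detected: shrink the lr
                lr = max(lr * factor, min_lr)
                wait = 0
    return lr

print(plateau_schedule([0.5, 0.4, 0.41, 0.42, 0.43], patience=2))
```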
Dynamic learning rate, approach 2:
import math
from keras.callbacks import LearningRateScheduler
from keras.optimizers import SGD

def step_decay(epoch):
    initial_lrate = 0.01
    drop = 0.5
    epochs_drop = 10.0
    # Halve the learning rate every `epochs_drop` epochs
    lrate = initial_lrate * math.pow(drop, math.floor((1 + epoch) / epochs_drop))
    return lrate

lrate = LearningRateScheduler(step_decay)
# lr=0.0 here because the scheduler overrides it each epoch
sgd = SGD(lr=0.0, momentum=0.9, decay=0.0, nesterov=False)
model.fit(train_set_x, train_set_y, validation_split=0.1, nb_epoch=200, batch_size=256, callbacks=[lrate])
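The step_decay function above halves the rate every 10 epochs; evaluating it standalone makes the schedule visible:

```python
import math

def step_decay(epoch):
    initial_lrate = 0.01
    drop = 0.5
    epochs_drop = 10.0
    # Rate is halved once for every completed block of `epochs_drop` epochs
    return initial_lrate * math.pow(drop, math.floor((1 + epoch) / epochs_drop))

print([step_decay(e) for e in (0, 9, 19, 29)])  # [0.01, 0.005, 0.0025, 0.00125]
```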
For details, see the article Using Learning Rate Schedules for Deep Learning Models in Python with Keras.
How do I record the training/validation loss and accuracy for each epoch?
Model.fit returns a History callback whose history attribute holds lists of the successive loss/accuracy values. For example:
hist = model.fit(X, y, validation_split=0.2)
print(hist.history)
How do I save the loss and val metrics that Keras prints to a text file?
fit returns a History object; its History.history attribute stores all of those values, and if a validation set was used, the validation metrics as well. Concretely:
hist = model.fit(train_set_x, train_set_y, batch_size=256, shuffle=True, nb_epoch=nb_epoch, validation_split=0.1)
with open('log_sgd_big_32.txt', 'w') as f:
    f.write(str(hist.history))
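Writing str(hist.history) is awkward to parse back later; a variant (our suggestion, not from the original) that uses JSON round-trips cleanly. The sample dict below stands in for hist.history:

```python
import json

# Stand-in for hist.history: metric name -> list of per-epoch values
history = {"loss": [0.9, 0.7, 0.6], "val_loss": [1.0, 0.8, 0.75]}

with open("log_sgd_big_32.json", "w") as f:
    json.dump(history, f)

with open("log_sgd_big_32.json") as f:
    restored = json.load(f)

print(restored["val_loss"][-1])  # 0.75
```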
Example: separate multiple callbacks with commas
checkpointer = ModelCheckpoint(filepath="./checkpoint.hdf5", verbose=1)
lrate = ReduceLROnPlateau(min_lr=0.00001)
answer.compile(optimizer='rmsprop', loss='categorical_crossentropy',
               metrics=['accuracy'])
answer.fit(
    [inputs_train, queries_train, inputs_train], answers_train,
    batch_size=32,
    nb_epoch=5000,
    validation_data=([inputs_test, queries_test, inputs_test], answers_test),
    callbacks=[checkpointer, lrate]
)
The TensorBoard callback in Keras
keras.callbacks.TensorBoard(log_dir='./Graph', histogram_freq=0,
                            write_graph=True, write_images=True)
tbCallBack = keras.callbacks.TensorBoard(log_dir='./Graph', histogram_freq=0, write_graph=True, write_images=True)
…
model.fit(…inputs and parameters…, callbacks=[tbCallBack])
tensorboard --logdir path_to_current_dir/Graph
Alternatively:
from keras.callbacks import TensorBoard
tensorboard = TensorBoard(log_dir='./logs', histogram_freq=0,
                          write_graph=True, write_images=False)
model.fit(X_train, Y_train,
          batch_size=batch_size,
          epochs=nb_epoch,
          validation_data=(X_test, Y_test),
          shuffle=True,
          callbacks=[tensorboard])
https://stackoverflow.com/questions/42112260/how-do-i-use-the-tensorboard-callback-of-keras
References
http://blog.csdn.net/u010159842/article/details/54602217
Building a reading-comprehension bot with Keras (用Keras搞一个阅读理解机器人)
Callbacks (回调函数Callbacks)
Keras Chinese documentation (Keras中文文档)
Notes on using the Keras deep-learning framework (深度学习框架Keras使用心得)