Dynamically Adjusting the Learning Rate

The learning rate has a large impact on how well a model learns; the two methods below adjust it dynamically during training.

  • Method 1: use the optimizer's decay parameter
from keras.optimizers import Adam

model.compile(loss='sparse_categorical_crossentropy',
              optimizer=Adam(lr=0.001, decay=1e-6),
              metrics=['accuracy'])
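With this approach the optimizer itself shrinks the learning rate a little on every batch update. Below is a minimal sketch of the resulting schedule, assuming the legacy Keras rule lr / (1 + decay * iterations); the effective_lr helper is illustrative only and not part of the Keras API:

def effective_lr(iteration, lr=0.001, decay=1e-6):
    # Learning rate applied at a given batch update under the assumed rule.
    return lr / (1.0 + decay * iteration)

for it in (0, 10000, 100000, 1000000):
    print(f"iteration {it}: lr = {effective_lr(it):.6f}")
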
  • Method 2: use the LearningRateScheduler callback
import numpy as np
from keras.callbacks import LearningRateScheduler

def step_decay_schedule(initial_lr=1e-3, decay_factor=0.75, step_size=10):
    '''
    Wrapper function to create a LearningRateScheduler with step decay schedule.
    '''
    def schedule(epoch):
        # Multiply initial_lr by decay_factor once every step_size epochs.
        return initial_lr * (decay_factor ** np.floor(epoch / step_size))
    
    return LearningRateScheduler(schedule)

lr_sched = step_decay_schedule(initial_lr=1e-4, decay_factor=0.75, step_size=2)
model.fit(X_train, Y_train, callbacks=[lr_sched])
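
Before launching a long run, it is easy to sanity-check what the callback will do by evaluating the same step-decay formula directly (using the example settings above: initial_lr=1e-4, decay_factor=0.75, step_size=2):

import numpy as np

initial_lr, decay_factor, step_size = 1e-4, 0.75, 2
for epoch in range(8):
    # Same formula as in schedule(): drop the lr by decay_factor every step_size epochs.
    lr = initial_lr * (decay_factor ** np.floor(epoch / step_size))
    print(f"epoch {epoch}: lr = {lr:.2e}")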

