Learning rate setup: warm-up and cosine learning rate

Warm-up was first proposed in this paper: https://arxiv.org/pdf/1706.02677.pdf . Following that paper, warm-up is typically applied only during the first 5 epochs. The cosine learning rate schedule comes from this paper: https://arxiv.org/pdf/1812.01187.pdf . In practice, combining warm-up with a cosine learning rate schedule usually gives better results.
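For reference, the cosine schedule implemented below scales the base learning rate $\eta_{\text{base}}$ by

$$\eta_t = \frac{1}{2}\left(1 + \cos\left(\frac{t - t_{\text{warm}}}{T - t_{\text{warm}}}\,\pi\right)\right)\eta_{\text{base}},$$

where $t$ is the current epoch, $t_{\text{warm}}$ is the number of warm-up epochs, and $T$ is the total number of epochs; during warm-up the rate instead ramps up linearly toward $\eta_{\text{base}}$. Code implementation: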

    import math
    import torch

    # MultiStepLR without warm-up: multiply the lr by gamma at every milestone epoch
    scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=args.milestones, gamma=0.1)

    # warm_up_with_multistep_lr: linear warm-up, then decay by 0.1 at each milestone.
    # (epoch + 1) avoids a multiplier of 0 (and hence an lr of 0) at epoch 0.
    warm_up_with_multistep_lr = lambda epoch: (epoch + 1) / args.warm_up_epochs if epoch < args.warm_up_epochs else 0.1 ** len([m for m in args.milestones if m <= epoch])
    scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=warm_up_with_multistep_lr)

    # warm_up_with_cosine_lr: linear warm-up, then cosine decay from the base lr down to 0
    warm_up_with_cosine_lr = lambda epoch: (epoch + 1) / args.warm_up_epochs if epoch < args.warm_up_epochs else 0.5 * (math.cos((epoch - args.warm_up_epochs) / (args.epochs - args.warm_up_epochs) * math.pi) + 1)
    scheduler = torch.optim.lr_scheduler.LambdaLR(optimizer, lr_lambda=warm_up_with_cosine_lr)

The three snippets above are, respectively: MultiStepLR decay without warm-up, warm-up combined with MultiStepLR decay, and warm-up combined with cosine learning rate decay. The two warm-up variants use PyTorch's lr_scheduler.LambdaLR to define a custom learning rate schedule.
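As a minimal usage sketch (model, criterion, and train_loader below are hypothetical placeholders, not part of the original post): LambdaLR multiplies the optimizer's initial learning rate by the lambda's return value, and scheduler.step() is called once per epoch to advance the schedule.

    # Hypothetical training loop; model, criterion, and train_loader are placeholders
    for epoch in range(args.epochs):
        for inputs, targets in train_loader:
            optimizer.zero_grad()
            loss = criterion(model(inputs), targets)
            loss.backward()
            optimizer.step()
        scheduler.step()  # advance the epoch-based schedule
        # current lr = initial lr * lambda(epoch); inspect it like this:
        print(epoch, optimizer.param_groups[0]['lr'])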

Reference: https://zhuanlan.zhihu.com/p/148487894
