Iteratively modifying the learning rate (lr) in PyTorch

import torch

lr = 0.01
optimizer = torch.optim.Adam(model.parameters(), lr=lr)
for epoch in range(1, args.epochs + 1):
    if epoch > 1:
        lr = lr / 1.5  # decay the learning rate by a factor of 1.5 every epoch after the first
    # write the updated value back into every parameter group of the optimizer
    for param_group in optimizer.param_groups:
        param_group['lr'] = lr
    train_epoch(epoch, args, model, device, train_loader, optimizer, Loss_list)
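
The same per-epoch decay can also be expressed with PyTorch's built-in torch.optim.lr_scheduler.ExponentialLR, which multiplies the lr of every parameter group by a fixed gamma each time scheduler.step() is called. Below is a minimal sketch assuming the same model, args, device, train_loader, Loss_list, and train_epoch as above; with gamma = 1/1.5 it reproduces the schedule of the manual loop.

import torch

optimizer = torch.optim.Adam(model.parameters(), lr=0.01)
# each scheduler.step() scales the lr of every param group by gamma
scheduler = torch.optim.lr_scheduler.ExponentialLR(optimizer, gamma=1 / 1.5)
for epoch in range(1, args.epochs + 1):
    train_epoch(epoch, args, model, device, train_loader, optimizer, Loss_list)
    scheduler.step()  # decay the lr once per epoch
    print(f"epoch {epoch}: lr = {optimizer.param_groups[0]['lr']}")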
