PyTorch lr_scheduler: AttributeError: 'ReduceLROnPlateau' object has no attribute 'get_last_lr'

When using ReduceLROnPlateau in PyTorch to adjust the learning rate, calling scheduler.get_last_lr() (or scheduler.get_lr()) to read the current learning rate raises the following errors:

scheduler.get_last_lr()

AttributeError: 'ReduceLROnPlateau' object has no attribute 'get_last_lr'

scheduler.get_lr()

AttributeError: 'ReduceLROnPlateau' object has no attribute 'get_lr'

The reason is that, in the PyTorch version used here, ReduceLROnPlateau does not inherit from the _LRScheduler base class that provides these methods. The fix is to read the learning rate directly from the optimizer:

optimizer.state_dict()['param_groups'][0]['lr']
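
Equivalently, the same value can be read straight from optimizer.param_groups without building a full state dict; a minimal sketch, assuming optimizer is the optimizer instance from the example below:

# Both expressions return the current learning rate of the first parameter group
lr_from_state_dict = optimizer.state_dict()['param_groups'][0]['lr']
lr_from_param_groups = optimizer.param_groups[0]['lr']
assert lr_from_state_dict == lr_from_param_groups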

The full example below demonstrates the fix:

import torch
from torchvision.models import AlexNet
from torch import optim
import matplotlib.pyplot as plt

model = AlexNet(num_classes=2)

# optimizer
# optimizer = torch.optim.Adam(model.parameters(),lr=1e-5)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01, momentum=0.9, nesterov=False)

# scheduler: cut the lr by a factor of 10 when the monitored metric stops improving for `patience` epochs
scheduler = optim.lr_scheduler.ReduceLROnPlateau(optimizer, mode='min', factor=0.1, patience=5, threshold=0.0001, threshold_mode='rel', cooldown=0, min_lr=0, eps=1e-08, verbose=False)

EPOCHS = 100
y = []
val_loss = 10
for epoch in range(1,EPOCHS+1):
    # dummy training step (no real forward/backward pass, just to drive the optimizer)
    optimizer.zero_grad()
    optimizer.step()
    # print("epoch %d learning rate: %f" % (epoch, optimizer.param_groups[0]['lr']))

    # If val_loss kept improving every epoch (uncomment the line below), the lr would never be reduced
    # val_loss *= 0.1
    
    # Drop val_loss by 10x every 10 epochs; in between it plateaus, so the scheduler cuts the lr
    if epoch%10 == 0:
        val_loss *= 0.1
    # y.append(scheduler.get_last_lr()[0]) # 'ReduceLROnPlateau' object has no attribute 'get_last_lr'
    # y.append(scheduler.get_lr()[0]) # 'ReduceLROnPlateau' object has no attribute 'get_lr'
    y.append(optimizer.state_dict()['param_groups'][0]['lr'])  # record the current lr
    scheduler.step(val_loss)  # ReduceLROnPlateau requires the monitored metric as an argument

# Plot how the lr changes over the epochs
x = list(range(EPOCHS))
plt.figure()
plt.plot(x, y)
plt.xlabel("epoch")
plt.ylabel("lr")
plt.title("ReduceLROnPlateau")
plt.show()
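
Note that whether ReduceLROnPlateau exposes get_last_lr() depends on the installed PyTorch version. A hedged sketch that works either way, assuming scheduler and optimizer are the objects created above:

# Prefer get_last_lr() when the installed PyTorch version provides it on
# ReduceLROnPlateau; otherwise fall back to reading the optimizer directly.
if hasattr(scheduler, "get_last_lr"):
    current_lr = scheduler.get_last_lr()[0]
else:
    current_lr = optimizer.param_groups[0]['lr']
print("current lr:", current_lr)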
