Setting per-layer learning rates in PyTorch

# Exclude the low-LR layers from the base group, so that no parameter
# appears in more than one param group (PyTorch forbids duplicates).
low_lr_params = (list(model.down2.parameters()) +   # parameters of down2
                 list(model.down3.parameters()) +   # parameters of down3
                 list(model.down4.parameters()))    # parameters of down4
low_lr_ids = {id(p) for p in low_lr_params}
base_params = [p for p in model.parameters() if id(p) not in low_lr_ids]

# Per-group settings override the keyword defaults; unspecified keys
# (momentum, dampening, nesterov) are inherited from the defaults.
optimizer = torch.optim.SGD(
    [{'params': base_params},                          # base learning rate
     {'params': low_lr_params,
      'lr': learning_rate / 100,                       # 1/100 of the base LR
      'weight_decay': 0}],                             # no weight decay here
    lr=learning_rate, momentum=0.9, nesterov=True, weight_decay=1e-5)
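If a group really must be added after the optimizer is built (as the original append-based code tried to do), the supported API is `optimizer.add_param_group()`: it validates the dict, fills missing hyperparameters from the optimizer's defaults, and rejects duplicate parameters. A minimal sketch with a hypothetical two-module model:

```python
import torch
import torch.nn as nn

# Illustrative modules; names are placeholders, not from the original model.
backbone = nn.Linear(8, 8)
head = nn.Linear(8, 2)

learning_rate = 0.1

# Start with only the backbone parameters in the optimizer.
optimizer = torch.optim.SGD(backbone.parameters(), lr=learning_rate,
                            momentum=0.9, nesterov=True, weight_decay=1e-5)

# Append a second group with a 100x smaller LR and no weight decay.
# momentum/dampening/nesterov are inherited from the defaults above.
optimizer.add_param_group({'params': head.parameters(),
                           'lr': learning_rate / 100,
                           'weight_decay': 0})

for g in optimizer.param_groups:
    print(g['lr'], g['weight_decay'], len(g['params']))
```

Appending a raw dict to `optimizer.param_groups` skips this validation, which is how the duplicated-parameter bug slips through unnoticed.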
