PyTorch: Constraining the Range of Trainable Parameters

1. Constrain the parameter range during training

out = net(frame)
loss = F.mse_loss(out, label_onehot)
optimizer.zero_grad()
loss.backward()
optimizer.step()
# Clamp every trainable parameter into [0, 99] right after the update
for p in net.parameters():
    p.data.clamp_(0, 99)
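
For context, here is a minimal self-contained sketch of this pattern. The tiny model, the synthetic data, and the [0, 99] bounds are placeholders chosen only to make the loop runnable, not part of the original snippet:

import torch
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical tiny model and synthetic data
net = nn.Linear(4, 2)
optimizer = torch.optim.SGD(net.parameters(), lr=0.1)
frame = torch.randn(8, 4)
label_onehot = torch.randn(8, 2)

for step in range(10):
    out = net(frame)
    loss = F.mse_loss(out, label_onehot)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # In-place clamp keeps every parameter inside the chosen range
    with torch.no_grad():
        for p in net.parameters():
            p.clamp_(0, 99)

Clamping inside a torch.no_grad() block is equivalent to clamping p.data; both modify the parameter values without recording the operation in the autograd graph.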

2. Constrain the parameter range with a custom constraint class

Define the class:

class weightConstraint(object):
    def __init__(self):
        pass

    def __call__(self, module):
        if hasattr(module, 'weight'):
            w = module.weight.data
            w = w.clamp(-1, 1)  # clamp the weights into [-1, 1]
            module.weight.data = w

Instantiate the class and apply it after each optimizer step:

# Apply the constraint to only the last layer
constraints = weightConstraint()
......
out = net(frame)
loss = F.mse_loss(out, label_onehot)
loss.backward()
optimizer.step()
# Clamp the weights of layer 'l2' after the update
net._modules['l2'].apply(constraints)
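
Accessing the layer through its attribute (e.g. net.l2) is equivalent to net._modules['l2'] and avoids the private _modules dict. To constrain every layer rather than a single one, the same callable can be passed to the whole model's apply(), which visits each submodule recursively. A short sketch, assuming the net and constraints objects from above:

# apply() walks the module tree, so every submodule that has a
# 'weight' attribute gets clamped into [-1, 1]
net.apply(constraints)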

3. List the trainable parameters of a model

for name, p in net.named_parameters():
    if p.requires_grad:  # only parameters that will be updated by the optimizer
        print(name)
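
A common follow-up is to count the trainable parameters. A minimal sketch built on the same iterator:

# Total number of trainable scalar values in the model
n_trainable = sum(p.numel() for p in net.parameters() if p.requires_grad)
print(f"trainable parameters: {n_trainable}")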
