PyTorch Deep Learning Practice: Implementing Logistic Regression with PyTorch (a Classification Problem)

References

Reference 1: https://blog.csdn.net/bit452/article/details/109680909
Reference 2: http://biranda.top/Pytorch%E5%AD%A6%E4%B9%A0%E7%AC%94%E8%AE%B0007%E2%80%94%E2%80%94%E5%88%86%E7%B1%BB%E9%97%AE%E9%A2%98/#%E9%97%AE%E9%A2%98%E5%BC%95%E5%85%A5

Implementing logistic regression with PyTorch
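Before the code, a quick summary of the math it implements (standard logistic-regression formulas): the linear output is squashed by a sigmoid into a probability, and the model is trained with the binary cross-entropy (BCE) loss, summed over the samples.

```latex
\hat{y} = \sigma(\omega x + b), \qquad
\sigma(z) = \frac{1}{1 + e^{-z}}
```

```latex
\text{loss} = -\sum_{n} \Bigl[ y_n \log \hat{y}_n + (1 - y_n) \log\bigl(1 - \hat{y}_n\bigr) \Bigr]
```

These two formulas correspond one-to-one to `torch.sigmoid` and `torch.nn.BCELoss(reduction='sum')` in the code below.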

Code:

import torch

x_data = torch.Tensor([[1.0],
                       [2.0],
                       [3.0]])
y_data = torch.Tensor([[0],
                       [0],
                       [1]])

# Switch to LogisticRegressionModel, which likewise inherits from torch.nn.Module
class LogisticRegressionModel(torch.nn.Module):
    def __init__(self):
        super(LogisticRegressionModel, self).__init__()
        self.linear = torch.nn.Linear(1,1)

    def forward(self, x):
        # Change 1: apply torch.sigmoid to the linear output,
        # mapping it to a probability in (0, 1)
        y_pred = torch.sigmoid(self.linear(x))
        return y_pred

model = LogisticRegressionModel()

# Change 2: the criterion object takes (y_pred, y) and now uses BCE
# (Binary Cross Entropy) loss instead of MSE
criterion = torch.nn.BCELoss(reduction='sum')
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

for epoch in range(1000):
    y_pred = model(x_data)
    loss = criterion(y_pred, y_data)
    print(epoch, loss.item())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print('w = ', model.linear.weight.item())
print('b = ', model.linear.bias.item())

x_test = torch.Tensor([[4.0]])
y_test = model(x_test)

print('y_pred = ', y_test.data)
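As a cross-check, here is a minimal pure-Python sketch (no PyTorch) of the same training. It relies on the fact that for the sigmoid + BCE combination, the gradient of the loss with respect to z = w*x + b is simply y_pred - y. It starts from w = b = 0, whereas `torch.nn.Linear` uses random initialization, so the final numbers will differ slightly from the run above.

```python
import math

# Same toy data as the PyTorch script: x = 3 is the only class-1 sample.
x_data = [1.0, 2.0, 3.0]
y_data = [0.0, 0.0, 1.0]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w, b, lr = 0.0, 0.0, 0.01
for epoch in range(1000):
    grad_w = grad_b = 0.0
    for x, y in zip(x_data, y_data):
        y_pred = sigmoid(w * x + b)
        # For summed BCE over a sigmoid, d(loss)/dz = y_pred - y
        grad_w += (y_pred - y) * x
        grad_b += (y_pred - y)
    # Full-batch gradient descent, matching SGD(lr=0.01) on all three samples
    w -= lr * grad_w
    b -= lr * grad_b

# Predicted probability that x = 4 belongs to class 1
print(sigmoid(w * 4.0 + b))
```

Like the PyTorch version, this drives the decision boundary between x = 2 and x = 3, so the prediction for x = 4 comes out well above 0.5.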

Result:
Feeding in x = 4 for prediction gives 0.8737, which is close to 1, so x = 4 is classified as class 1.
