PyTorch Notes (1): A First PyTorch Program

  • 1. The classic linear function
  • 2. Code implementation
  • 3. Linear regression results

1. The classic linear function

First we generate some random training samples that follow the classic function $Y = W^{\top}X + b$, with a little noise added so the samples deviate slightly from the ideal curve (note that the listing below actually generates noise-free samples). We then build a linear regression model and, for each training batch, back-propagate the loss to obtain gradients, stopping once the loss drops below a specified threshold.
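
In the listing below, make_features expands a scalar $x$ into the feature vector $X = (x, x^2, x^3, x^4)^{\top}$ (with poly_degree = 4), so the model is linear in the weights $W$ but is a degree-4 polynomial in $x$. Written out, the target function is:

$Y = W^{\top}X + b = w_1 x + w_2 x^2 + w_3 x^3 + w_4 x^4 + b$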

2. Code implementation

#!/usr/bin/env python
# -*- coding: utf-8 -*-
from __future__ import print_function
from itertools import count

import numpy as np
import torch
import torch.nn.functional as F
from torch.autograd import Variable  # legacy wrapper; since PyTorch 0.4 this is a no-op returning a Tensor
import matplotlib.pyplot as plt

random_state = 5000
torch.manual_seed(random_state)
poly_degree = 4  # degree of the polynomial features
W_target = torch.randn(poly_degree, 1) * 5  # ground-truth weights
b_target = torch.randn(1) * 5               # ground-truth bias


def make_features(x):
    """
    创建一个特征矩阵结构为[x, x^2, x^3, x^4].
    """
    x = x.unsqueeze(1)
    return torch.cat([x ** i for i in range(1, poly_degree + 1)], 1)


def f(x):
    """The ground-truth target function to be approximated."""
    return x.mm(W_target) + b_target[0]


def poly_desc(W, b):
    """生成多项式描述内容"""
    result = 'y = '
    for i, w in enumerate(W):
        result += '{:+.2f} x^{} '.format(w, len(W) - i)
    result += '{:+.2f}'.format(b[0])
    return result


def get_batch(batch_size=32):
    """Build a batch of exact (x, f(x)) pairs, sorted so the fitted curve plots cleanly."""
    random = torch.from_numpy(np.sort(torch.randn(batch_size)))
    x = make_features(random)
    y = f(x)
    return Variable(x), Variable(y)


# Define the model: one fully connected (linear) layer mapping 4 features to 1 output
fc = torch.nn.Linear(W_target.size(0), 1)

for batch_idx in count(1):
    # Fetch a batch of training data
    batch_x, batch_y = get_batch()
    # Reset the accumulated gradients
    fc.zero_grad()

    # Forward pass: compute the smooth L1 loss
    output = F.smooth_l1_loss(fc(batch_x), batch_y)
    loss = output.item()

    # Backward pass: compute gradients of the loss w.r.t. the parameters
    output.backward()

    # Apply the gradients: a manual SGD step with learning rate 0.1
    for param in fc.parameters():
        param.data.add_(-0.1 * param.grad.data)

    # Stopping condition: plot and exit once the loss is small enough
    if loss < 1e-3:
        plt.cla()
        plt.scatter(batch_x.data.numpy()[:, 0], batch_y.data.numpy()[:, 0], label='training samples', color='b')
        plt.plot(batch_x.data.numpy()[:, 0], fc(batch_x).data.numpy()[:, 0], label='fitted curve', color='r')
        plt.title('$Y=W^T*X+b$')
        plt.legend()
        plt.savefig('1.png')
        plt.show()
        break

print('Loss: {:.6f} after {} batches'.format(loss, batch_idx))
print('==> Learned function:\t' + poly_desc(fc.weight.data.view(-1), fc.bias.data))
print('==> Actual function:\t' + poly_desc(W_target.view(-1), b_target))
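
The parameter update above is plain SGD written by hand. For comparison, here is a minimal sketch of the same training step using torch.optim.SGD, the more idiomatic formulation (it assumes the fc, get_batch, F, and count definitions from the listing above; plotting and the printed summary are omitted):

optimizer = torch.optim.SGD(fc.parameters(), lr=0.1)  # same step size as the manual update

for batch_idx in count(1):
    batch_x, batch_y = get_batch()
    optimizer.zero_grad()                          # replaces fc.zero_grad()
    loss = F.smooth_l1_loss(fc(batch_x), batch_y)  # forward pass
    loss.backward()                                # backward pass
    optimizer.step()                               # replaces the manual param.data.add_(...)
    if loss.item() < 1e-3:
        break

With the default beta of 1.0, F.smooth_l1_loss is 0.5·e² for per-element errors |e| < 1 and |e| − 0.5 otherwise, so the 1e-3 stopping threshold effectively demands a very small squared error averaged over the batch.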

3. Linear regression results

[Figure 1: the fitted curve (red) plotted against the training samples (blue); saved by the script as 1.png]
