Deep Learning: The Parameter Update Process of a Neural Network

Author: 龙箬
Computer Application Technology
Change the World with Data and Artificial Intelligence!
CSDN@weixin_43975035
The only lesson humanity learns from history is that it never learns anything from history.

  • GitHub link
    [Figures 1–3 from the original post]
    The code is as follows. The first snippet derives the gradient of w1_11 by hand and applies a single gradient-descent step:
import torch
import torch.nn.functional as F

# Input sample and label
X1 = torch.tensor(1.)
X2 = torch.tensor(2.)
Y = torch.tensor(2.)
Lr = 0.01  # learning rate

# Weight initialization (first index: layer; remaining indices: position)
w1_11 = torch.tensor(1., requires_grad=True)
w1_12 = torch.tensor(1., requires_grad=True)
w1_21 = torch.tensor(1., requires_grad=True)
w1_22 = torch.tensor(-1., requires_grad=True)
w2_11 = torch.tensor(1., requires_grad=True)
w2_21 = torch.tensor(-1., requires_grad=True)

# Bias initialization
b1_11, b1_12, b2_11 = torch.tensor([1., 1., 1.], requires_grad=True)

# Forward pass
y1_1 = w1_11*X1 + w1_21*X2 + b1_11
x2_1 = F.relu(y1_1)
y1_2 = w1_12*X1 + w1_22*X2 + b1_12
x2_2 = F.relu(y1_2)
A = w2_11*x2_1 + w2_21*x2_2 + b2_11

# Loss (squared error with a factor of 1/2)
Loss = (A - Y).pow(2) / 2

# Gradient of the loss w.r.t. w1_11 by the chain rule;
# the factor 1 is the ReLU derivative at y1_1, which is positive here
w1_11_grad = (A - Y) * w2_11 * 1 * X1
print(w1_11_grad)

# Gradient-descent update of w1_11
w1_11 = w1_11 - Lr * w1_11_grad
print(w1_11)
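
The hand-computed gradient above is just the chain rule written out for w1_11. Restated in symbols (my own notation, intended to match the code rather than the original figures), with L denoting the loss:

\frac{\partial L}{\partial w^{(1)}_{11}}
= \frac{\partial L}{\partial A}\cdot\frac{\partial A}{\partial x^{(2)}_{1}}\cdot\frac{\partial x^{(2)}_{1}}{\partial y^{(1)}_{1}}\cdot\frac{\partial y^{(1)}_{1}}{\partial w^{(1)}_{11}}
= (A - Y)\cdot w^{(2)}_{11}\cdot \mathbf{1}\!\left[y^{(1)}_{1} > 0\right]\cdot X_1

Because y^{(1)}_{1} = 1*1 + 1*2 + 1 = 4 > 0, the ReLU indicator equals 1, which is the literal 1 in the code.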
The same forward pass and update can also be done with autograd instead of the hand-derived gradient:

import torch
import torch.nn.functional as F

# Input sample and label
X1 = torch.tensor(1.)
X2 = torch.tensor(2.)
Y = torch.tensor(2.)
Lr = 0.01  # learning rate

# Weight initialization
w1_11 = torch.tensor(1., requires_grad=True)
w1_12 = torch.tensor(1., requires_grad=True)
w1_21 = torch.tensor(1., requires_grad=True)
w1_22 = torch.tensor(-1., requires_grad=True)
w2_11 = torch.tensor(1., requires_grad=True)
w2_21 = torch.tensor(-1., requires_grad=True)

# Bias initialization
b1_11, b1_12, b2_11 = torch.tensor([1., 1., 1.], requires_grad=True)

# Forward pass
y1_1 = w1_11*X1 + w1_21*X2 + b1_11
x2_1 = F.relu(y1_1)
y1_2 = w1_12*X1 + w1_22*X2 + b1_12
x2_2 = F.relu(y1_2)
A = w2_11*x2_1 + w2_21*x2_2 + b2_11

# Loss (squared error with a factor of 1/2)
Loss = (A - Y).pow(2) / 2

# Backward pass: autograd fills in .grad for the leaf parameters
Loss.backward()
print(w1_11.grad)

# Update w1_11 in place, outside of gradient tracking
with torch.no_grad():
    w1_11 -= Lr * w1_11.grad

print(w1_11)
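
Worked through by hand with the initial values above (my own arithmetic, not copied from the original screenshots), both snippets compute the same numbers:

y1_1 = 1*1 + 1*2 + 1 = 4, so x2_1 = relu(4) = 4
y1_2 = 1*1 + (-1)*2 + 1 = 0, so x2_2 = relu(0) = 0
A = 1*4 + (-1)*0 + 1 = 5, and Loss = (5 - 2)^2 / 2 = 4.5
dLoss/dw1_11 = (5 - 2) * 1 * 1 * 1 = 3, so the updated w1_11 = 1 - 0.01*3 = 0.97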

The execution results are as follows:
[Figure 4 from the original post: execution output]
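
In practice this kind of manual update is usually delegated to an optimizer. Below is a minimal sketch of the same single-step update written with torch.optim.SGD; it is my own addition rather than part of the original post, and the parameter names simply follow the snippets above.

import torch
import torch.nn.functional as F

# Same sample, label and learning rate as above
X1, X2, Y, Lr = torch.tensor(1.), torch.tensor(2.), torch.tensor(2.), 0.01

# Weights and biases as leaf tensors so the optimizer can update them
params = {
    name: torch.tensor(v, requires_grad=True)
    for name, v in [('w1_11', 1.), ('w1_12', 1.), ('w1_21', 1.), ('w1_22', -1.),
                    ('w2_11', 1.), ('w2_21', -1.),
                    ('b1_11', 1.), ('b1_12', 1.), ('b2_11', 1.)]
}
optimizer = torch.optim.SGD(params.values(), lr=Lr)

# Forward pass, loss, backward pass, and one SGD step
x2_1 = F.relu(params['w1_11']*X1 + params['w1_21']*X2 + params['b1_11'])
x2_2 = F.relu(params['w1_12']*X1 + params['w1_22']*X2 + params['b1_12'])
A = params['w2_11']*x2_1 + params['w2_21']*x2_2 + params['b2_11']
Loss = (A - Y).pow(2) / 2

optimizer.zero_grad()
Loss.backward()
optimizer.step()          # applies p -= Lr * p.grad to every parameter

print(params['w1_11'])    # updated first-layer weight

The optimizer call replaces the explicit torch.no_grad() update and scales naturally once the toy network is rewritten with more parameters.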

Readers who need the source data and code for this experiment can contact QQ: 2225872659.
