[PyTorch Neural Networks] The Training Process

Table of Contents

    • A Single Training Step
    • The Training Loop

A Single Training Step

import torch
import torch.nn.functional as F
import torch.optim as optim

network = Network()

train_loader = torch.utils.data.DataLoader(train_set, batch_size=100)
optimizer = optim.Adam(network.parameters(), lr=0.01)

# 1. Get a batch from the training set
batch = next(iter(train_loader))
images, labels = batch

# 2. Pass the batch through the network
preds = network(images)
# 3. Compute the loss
loss = F.cross_entropy(preds, labels)

# 4. Backpropagate to compute the gradients
loss.backward()
# 5. Update the weights with the gradients to reduce the loss
optimizer.step()

#--------------------------------------------------
print(f"loss1: {loss.item()}")
preds = network(images)
loss = F.cross_entropy(preds, labels)
print(f"loss2: {loss.item()}")

# loss1: 2.2933034896850586
# loss2: 2.2825779914855957
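The two printed values show that a single optimizer step on the same batch lowered the loss. The same step can be sketched in a fully self-contained way; here a plain `nn.Linear` layer and random tensors stand in for the article's `Network` and `train_set` (both hypothetical substitutes, not the author's actual model):

```python
import torch
import torch.nn.functional as F
import torch.optim as optim

torch.manual_seed(0)

network = torch.nn.Linear(784, 10)      # stand-in for Network()
images = torch.randn(100, 784)          # one fake batch of flattened 28x28 images
labels = torch.randint(0, 10, (100,))   # fake class labels

optimizer = optim.Adam(network.parameters(), lr=0.01)

# forward, loss, backward, step -- same five stages as above
preds = network(images)
loss1 = F.cross_entropy(preds, labels)
loss1.backward()
optimizer.step()

# re-evaluating on the same batch after one step
preds = network(images)
loss2 = F.cross_entropy(preds, labels)
print(loss1.item(), loss2.item())
```

On the same batch, `loss2` should come out lower than `loss1`, mirroring the `loss1`/`loss2` printout above.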

The Training Loop

def get_num_correct(preds, labels):
    return preds.argmax(dim=1).eq(labels).sum().item()
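`get_num_correct` takes the highest-scoring class per row as the prediction and counts matches against the labels. A tiny hand-made example (tensors invented for illustration):

```python
import torch

def get_num_correct(preds, labels):
    # argmax(dim=1) picks the predicted class for each sample
    return preds.argmax(dim=1).eq(labels).sum().item()

# two samples: the first scores class 1 highest, the second class 0;
# both match the labels, so the count is 2
preds = torch.tensor([[0.1, 0.9], [0.8, 0.2]])
labels = torch.tensor([1, 0])
print(get_num_correct(preds, labels))  # → 2
```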

The full training loop:

network = Network()

train_loader = torch.utils.data.DataLoader(train_set, batch_size=100)
optimizer = optim.Adam(network.parameters(), lr=0.01)


for epoch in range(5):
    total_loss = 0
    total_correct = 0

    for batch in train_loader:
        images, labels = batch

        preds = network(images)
        loss = F.cross_entropy(preds, labels)

        optimizer.zero_grad()  # clear gradients left over from the previous batch
        loss.backward()  # backpropagate to compute the gradients
        optimizer.step() # update the weights

        # use .item() so we accumulate a plain float instead of a tensor
        # that keeps the whole computation graph alive
        total_loss += loss.item()
        total_correct += get_num_correct(preds, labels)
    
    print(f"epoch: {epoch}, total_correct:{total_correct}, total_loss: {total_loss}")
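A raw `total_correct` count is easier to read as a fraction of the dataset size (60,000 for MNIST/Fashion-MNIST, i.e. `len(train_set)`). A minimal sketch with an invented per-epoch count:

```python
# Hypothetical epoch summary: convert the correct-prediction count
# into an accuracy by dividing by the dataset size.
total_correct = 46000          # example value from one epoch
dataset_size = 60000           # len(train_set)
accuracy = total_correct / dataset_size
print(f"accuracy: {accuracy:.4f}")  # → accuracy: 0.7667
```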
