gisleung/py_pytorch_learn

This repository declares no open-source license file (LICENSE); check the project description and the code's upstream dependencies before use.
017 nn_optim.py (1.96 KB)
gisleung committed on 2022-04-01 20:26 . Review finished
"""
将 优化器 用于网络训练(调优)
torch.optim.SGD :
"""
import torch
import torchvision
from torch import nn
from torch.nn import Conv2d, MaxPool2d, ReLU, Sigmoid, Linear, Flatten, Sequential
from torch.utils.data import DataLoader
from torch.utils.tensorboard import SummaryWriter
writer = SummaryWriter("logs/017")
dataset = torchvision.datasets.CIFAR10(root="./visionData", train=False,
                                       transform=torchvision.transforms.ToTensor(),
                                       download=True)
dataloader = DataLoader(dataset, batch_size=1)
# Sequential makes the code more concise
class MyNnSe(nn.Module):
    def __init__(self):
        super(MyNnSe, self).__init__()
        self.model1 = Sequential(
            Conv2d(3, 32, 5, padding=2),
            MaxPool2d(2),
            Conv2d(32, 32, 5, padding=2),
            MaxPool2d(2),
            Conv2d(32, 64, 5, padding=2),
            MaxPool2d(2),
            Flatten(),
            Linear(1024, 64),
            Linear(64, 10)
        )

    def forward(self, x):
        x = self.model1(x)
        return x
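# Shape trace for the Sequential above (derived from the layer parameters;
# an illustrative note, not part of the original file).
# Input is a CIFAR-10 batch of shape (N, 3, 32, 32):
#   Conv2d(3, 32, 5, padding=2)  -> (N, 32, 32, 32)  (padding=2 keeps 32x32)
#   MaxPool2d(2)                 -> (N, 32, 16, 16)
#   Conv2d(32, 32, 5, padding=2) -> (N, 32, 16, 16)
#   MaxPool2d(2)                 -> (N, 32, 8, 8)
#   Conv2d(32, 64, 5, padding=2) -> (N, 64, 8, 8)
#   MaxPool2d(2)                 -> (N, 64, 4, 4)
#   Flatten()                    -> (N, 1024)         (64 * 4 * 4 = 1024)
#   Linear(1024, 64)             -> (N, 64)
#   Linear(64, 10)               -> (N, 10)           (one logit per CIFAR-10 class)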
mySeNn = MyNnSe()
loss = nn.CrossEntropyLoss()
# Construct the optimizer
optim = torch.optim.SGD(mySeNn.parameters(), lr=0.01)
'''
The training routine has three steps (see https://blog.csdn.net/weixin_45072810/article/details/109687210):
# 1) Zero the gradients:                            optimizer.zero_grad()
# 2) Backpropagate to get each parameter's gradient: loss.backward()
# 3) Take one gradient-descent update step:          optimizer.step()
'''
# Train for 20 epochs, printing each epoch's total loss
for epoch in range(20):
    running_loss = 0.0
    for data in dataloader:
        imgs, targets = data
        outputs = mySeNn(imgs)
        result_loss = loss(outputs, targets)
        optim.zero_grad()       # 1) zero the gradients first
        result_loss.backward()  # 2) compute gradients for the optimizer to use
        optim.step()            # 3) update the parameters
        # .item() extracts the scalar so the autograd graph is not kept alive
        running_loss = running_loss + result_loss.item()
    print(running_loss)
    writer.add_scalar("epoch_loss", running_loss, epoch)  # log to TensorBoard so the writer is actually used
writer.close()
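The three-step zero_grad → backward → step pattern above can also be seen in isolation on a toy one-parameter problem (a minimal sketch; the tensors and learning rate here are illustrative, not from the original script):

```python
import torch

# Fit a single weight w so that w * x matches y; the true weight is 2.0.
x = torch.tensor([3.0])
y = torch.tensor([6.0])
w = torch.zeros(1, requires_grad=True)

opt = torch.optim.SGD([w], lr=0.05)
for _ in range(50):
    pred = w * x
    l = torch.nn.functional.mse_loss(pred, y)
    opt.zero_grad()  # 1) clear gradients left over from the previous step
    l.backward()     # 2) backpropagate: fills w.grad
    opt.step()       # 3) one gradient-descent update of w

print(round(w.item(), 2))  # converges toward 2.0
```

Skipping `opt.zero_grad()` would make `w.grad` accumulate across iterations, which is exactly the bug the first step of the routine guards against.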