
gisleung/py_pytorch_learn

This repository has not declared an open-source license file (LICENSE); before using it, check the project description and the upstream dependencies of its code.
016 nn_loss_network.py 1.35 KB
gisleung committed on 2022-04-01 20:26 . Review finished
"""
Add a loss function to the neural network.
"""
import torchvision
from torch import nn
from torch.nn import Conv2d, MaxPool2d, ReLU, Sigmoid, Linear, Flatten, Sequential
from torch.utils.data import DataLoader
from torch.utils.tensorboard import SummaryWriter
writer = SummaryWriter("logs/016")
dataset = torchvision.datasets.CIFAR10(root="./visionData", train=False, transform=torchvision.transforms.ToTensor(),
                                       download=True)
dataloader = DataLoader(dataset, batch_size=1)
# Sequential makes the code more concise.
class MyNnSe(nn.Module):
    def __init__(self):
        super(MyNnSe, self).__init__()
        self.model1 = Sequential(
            Conv2d(3, 32, 5, padding=2),
            MaxPool2d(2),
            Conv2d(32, 32, 5, padding=2),
            MaxPool2d(2),
            Conv2d(32, 64, 5, padding=2),
            MaxPool2d(2),
            Flatten(),
            Linear(1024, 64),
            Linear(64, 10)
        )

    def forward(self, x):
        x = self.model1(x)
        return x
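# Note on the Linear(1024, 64) input size above: the 5x5 convolutions use padding=2 and
# keep the spatial size, while the three 2x2 max-pools shrink the 32x32 CIFAR10 image to
# 4x4; with 64 output channels, Flatten() therefore yields 64 * 4 * 4 = 1024 features.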
loss = nn.CrossEntropyLoss()
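# nn.CrossEntropyLoss combines log-softmax and negative log-likelihood: it takes the raw
# scores (logits) of shape (N, 10) from the network and class-index targets of shape (N,),
# so no softmax layer is needed at the end of the model.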
mySeNn = MyNnSe()
for data in dataloader:
    imgs, targets = data
    outputs = mySeNn(imgs)
    result_loss = loss(outputs, targets)
    # Compute gradients, which an optimizer (torch.optim) uses to adjust the network's
    # weights (e.g. the convolution kernels) via back-propagation; a sketch of that
    # next step follows after this file.
    result_loss.backward()
    print(result_loss)
writer.close()
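The comment before result_loss.backward() notes that the computed gradients are meant for an optimizer, and the SummaryWriter is opened but never written to. Below is a minimal sketch of that next step, reusing the dataloader, loss and MyNnSe defined above; the SGD optimizer, the lr=0.01 learning rate, the "train_loss" tag and the variable names are illustrative choices, not part of the original file.

import torch

model = MyNnSe()
optim = torch.optim.SGD(model.parameters(), lr=0.01)  # illustrative learning rate
writer = SummaryWriter("logs/016")

for step, (imgs, targets) in enumerate(dataloader):
    outputs = model(imgs)
    result_loss = loss(outputs, targets)
    optim.zero_grad()          # clear gradients accumulated in the previous step
    result_loss.backward()     # compute gradients for every learnable parameter
    optim.step()               # let the optimizer adjust the weights
    writer.add_scalar("train_loss", result_loss.item(), step)  # log the loss curve

writer.close()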