PyTorch CNN in Practice: An MNIST Handwritten Digit Recognition Example


Posted in Python on May 29, 2018

Introduction

The convolutional neural network (Convolutional Neural Network, CNN) is one of the most representative architectures in deep learning. It has been highly successful in image processing, and many winning models on the standard ImageNet benchmark are CNN-based.

A CNN typically consists of the following layers:

  1. Input layer: receives the input data
  2. Convolutional layer: extracts and maps features using convolution kernels
  3. Activation layer: since convolution is itself a linear operation, a non-linear mapping must be added
  4. Pooling layer: downsamples and sparsifies the feature maps, reducing the amount of computation
  5. Fully connected layer: usually placed at the tail of the CNN to refit the features and reduce the loss of feature information
  6. Output layer: produces the final result
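The spatial sizes flowing through the convolution and pooling layers follow the standard formula floor((n + 2*pad - k) / stride) + 1. A quick pure-Python sanity check (assuming the 28x28 MNIST inputs, 5x5 stride-1 convolutions, and 2x2 max-pooling used by the network below) shows where the 320 in the fully connected layer comes from:

```python
def conv_out(n, k, stride=1, pad=0):
    # output size of a conv/pool layer along one spatial dimension
    return (n + 2 * pad - k) // stride + 1

n = 28                         # MNIST images are 28x28
n = conv_out(n, 5)             # conv1, 5x5 kernel -> 24
n = conv_out(n, 2, stride=2)   # 2x2 max pool     -> 12
n = conv_out(n, 5)             # conv2, 5x5 kernel -> 8
n = conv_out(n, 2, stride=2)   # 2x2 max pool     -> 4
print(n, 20 * n * n)           # 4 320, matching nn.Linear(320, 10)
```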


PyTorch in Practice

This article uses the MNIST handwritten digit dataset from the previous post to put a CNN into practice.

import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.optim as optim
from torchvision import datasets, transforms
from torch.autograd import Variable

# Training settings
batch_size = 64

# MNIST Dataset
train_dataset = datasets.MNIST(root='./data/',
                train=True,
                transform=transforms.ToTensor(),
                download=True)

test_dataset = datasets.MNIST(root='./data/',
               train=False,
               transform=transforms.ToTensor())

# Data Loader (Input Pipeline)
train_loader = torch.utils.data.DataLoader(dataset=train_dataset,
                      batch_size=batch_size,
                      shuffle=True)

test_loader = torch.utils.data.DataLoader(dataset=test_dataset,
                     batch_size=batch_size,
                     shuffle=False)


class Net(nn.Module):
  def __init__(self):
    super(Net, self).__init__()
    # 1 input channel, 10 output channels, 5x5 kernel
    self.conv1 = nn.Conv2d(1, 10, kernel_size=5)
    self.conv2 = nn.Conv2d(10, 20, kernel_size=5)
    self.mp = nn.MaxPool2d(2)
    # fully connected layer: flattened 20*4*4 = 320 features -> 10 classes
    self.fc = nn.Linear(320, 10)

  def forward(self, x):
    in_size = x.size(0)  # batch size (64 with the loader above)
    # x: 64*10*12*12 after conv1 (5x5 kernel) and 2x2 max-pooling
    x = F.relu(self.mp(self.conv1(x)))
    # x: 64*20*4*4 after conv2 and pooling
    x = F.relu(self.mp(self.conv2(x)))
    x = x.view(in_size, -1)  # flatten the tensor to 64*320
    x = self.fc(x)  # 64*10
    return F.log_softmax(x, dim=1)


model = Net()

optimizer = optim.SGD(model.parameters(), lr=0.01, momentum=0.5)

def train(epoch):
  for batch_idx, (data, target) in enumerate(train_loader):
    optimizer.zero_grad()
    output = model(data)
    loss = F.nll_loss(output, target)
    loss.backward()
    optimizer.step()
    if batch_idx % 200 == 0:
      print('Train Epoch: {} [{}/{} ({:.0f}%)]\tLoss: {:.6f}'.format(
        epoch, batch_idx * len(data), len(train_loader.dataset),
        100. * batch_idx / len(train_loader), loss.item()))


def test():
  model.eval()
  test_loss = 0
  correct = 0
  with torch.no_grad():  # gradients are not needed during evaluation
    for data, target in test_loader:
      output = model(data)
      # sum up batch loss
      test_loss += F.nll_loss(output, target, reduction='sum').item()
      # get the index of the max log-probability
      pred = output.data.max(1, keepdim=True)[1]
      correct += pred.eq(target.data.view_as(pred)).cpu().sum().item()

  test_loss /= len(test_loader.dataset)
  print('\nTest set: Average loss: {:.4f}, Accuracy: {}/{} ({:.0f}%)\n'.format(
    test_loss, correct, len(test_loader.dataset),
    100. * correct / len(test_loader.dataset)))


for epoch in range(1, 10):
  train(epoch)
  test()
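forward() ends with log_softmax and training uses F.nll_loss; together they compute the cross-entropy loss. A minimal pure-Python sketch of what that combination does for a single sample (illustrative only, not the PyTorch implementation; the logits are made up):

```python
import math

def log_softmax(logits):
    # numerically stable log-softmax: x_i - log(sum_j exp(x_j))
    m = max(logits)
    log_sum = m + math.log(sum(math.exp(v - m) for v in logits))
    return [x - log_sum for x in logits]

def nll_loss(log_probs, target):
    # NLL loss of one sample: minus the log-probability of the target class
    return -log_probs[target]

logits = [2.0, 0.5, -1.0]  # hypothetical scores for 3 classes
lp = log_softmax(logits)
print(nll_loss(lp, 0))     # smallest loss: class 0 already dominates
```

The loss shrinks as the log-probability of the correct class approaches 0 (i.e. probability 1), which is exactly what the training loop drives toward.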

Output:

Train Epoch: 1 [0/60000 (0%)]   Loss: 2.315724
Train Epoch: 1 [12800/60000 (21%)]  Loss: 1.931551
Train Epoch: 1 [25600/60000 (43%)]  Loss: 0.733935
Train Epoch: 1 [38400/60000 (64%)]  Loss: 0.165043
Train Epoch: 1 [51200/60000 (85%)]  Loss: 0.235188

Test set: Average loss: 0.1935, Accuracy: 9421/10000 (94%)

Train Epoch: 2 [0/60000 (0%)]   Loss: 0.333513
Train Epoch: 2 [12800/60000 (21%)]  Loss: 0.163156
Train Epoch: 2 [25600/60000 (43%)]  Loss: 0.213840
Train Epoch: 2 [38400/60000 (64%)]  Loss: 0.141114
Train Epoch: 2 [51200/60000 (85%)]  Loss: 0.128191

Test set: Average loss: 0.1180, Accuracy: 9645/10000 (96%)

Train Epoch: 3 [0/60000 (0%)]   Loss: 0.206469
Train Epoch: 3 [12800/60000 (21%)]  Loss: 0.234443
Train Epoch: 3 [25600/60000 (43%)]  Loss: 0.061048
Train Epoch: 3 [38400/60000 (64%)]  Loss: 0.192217
Train Epoch: 3 [51200/60000 (85%)]  Loss: 0.089190

Test set: Average loss: 0.0938, Accuracy: 9723/10000 (97%)

Train Epoch: 4 [0/60000 (0%)]   Loss: 0.086325
Train Epoch: 4 [12800/60000 (21%)]  Loss: 0.117741
Train Epoch: 4 [25600/60000 (43%)]  Loss: 0.188178
Train Epoch: 4 [38400/60000 (64%)]  Loss: 0.049807
Train Epoch: 4 [51200/60000 (85%)]  Loss: 0.174097

Test set: Average loss: 0.0743, Accuracy: 9767/10000 (98%)

Train Epoch: 5 [0/60000 (0%)]   Loss: 0.063171
Train Epoch: 5 [12800/60000 (21%)]  Loss: 0.061265
Train Epoch: 5 [25600/60000 (43%)]  Loss: 0.103549
Train Epoch: 5 [38400/60000 (64%)]  Loss: 0.019137
Train Epoch: 5 [51200/60000 (85%)]  Loss: 0.067103

Test set: Average loss: 0.0720, Accuracy: 9781/10000 (98%)

Train Epoch: 6 [0/60000 (0%)]   Loss: 0.069251
Train Epoch: 6 [12800/60000 (21%)]  Loss: 0.075502
Train Epoch: 6 [25600/60000 (43%)]  Loss: 0.052337
Train Epoch: 6 [38400/60000 (64%)]  Loss: 0.015375
Train Epoch: 6 [51200/60000 (85%)]  Loss: 0.028996

Test set: Average loss: 0.0694, Accuracy: 9783/10000 (98%)

Train Epoch: 7 [0/60000 (0%)]   Loss: 0.171613
Train Epoch: 7 [12800/60000 (21%)]  Loss: 0.078520
Train Epoch: 7 [25600/60000 (43%)]  Loss: 0.149186
Train Epoch: 7 [38400/60000 (64%)]  Loss: 0.026692
Train Epoch: 7 [51200/60000 (85%)]  Loss: 0.108824

Test set: Average loss: 0.0672, Accuracy: 9793/10000 (98%)

Train Epoch: 8 [0/60000 (0%)]   Loss: 0.029188
Train Epoch: 8 [12800/60000 (21%)]  Loss: 0.031202
Train Epoch: 8 [25600/60000 (43%)]  Loss: 0.194858
Train Epoch: 8 [38400/60000 (64%)]  Loss: 0.051497
Train Epoch: 8 [51200/60000 (85%)]  Loss: 0.024832

Test set: Average loss: 0.0535, Accuracy: 9837/10000 (98%)

Train Epoch: 9 [0/60000 (0%)]   Loss: 0.026706
Train Epoch: 9 [12800/60000 (21%)]  Loss: 0.057807
Train Epoch: 9 [25600/60000 (43%)]  Loss: 0.065225
Train Epoch: 9 [38400/60000 (64%)]  Loss: 0.037004
Train Epoch: 9 [51200/60000 (85%)]  Loss: 0.057822

Test set: Average loss: 0.0538, Accuracy: 9829/10000 (98%)

Process finished with exit code 0
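As a side note, the accuracy percentages in the log follow directly from the correct/total counts via the format string used in test(). Re-checking the final epoch in plain Python (numbers taken from the log above):

```python
# final-epoch counts from the log above
correct, total = 9829, 10000
line = 'Accuracy: {}/{} ({:.0f}%)'.format(correct, total, 100. * correct / total)
print(line)  # Accuracy: 9829/10000 (98%)
```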

Reference: https://github.com/hunkim/PyTorchZeroToAll

That concludes this article. I hope it is helpful for your studies.
