Paddle: Basic Network Architectures Explained


Project address


Building Basic Networks





Humans and computers understand images in completely different ways: what a computer reads is just a string of numbers, so "understanding" an image is much harder for it.




This is the semantic gap: to a computer, different pixels and regions can look extremely similar or be highly misleading. Some images trip up even humans who do not look carefully, so it is no surprise that machines struggle.





Solving these problems will take continued work on machine understanding and better AI techniques.


The Image Recognition Pipeline


The goal of using machine learning (deep learning) is to find a suitable function.




Through a series of steps, data processing, model building, training, and testing, we can arrive at a reasonably good result.
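As a minimal sketch of that pipeline, independent of paddle, the following pure-Python toy run prepares data, defines a one-parameter model, trains it by gradient descent, and tests it on held-out samples. All data and constants here are illustrative:

```python
# A toy version of the pipeline: prepare data -> build model -> train -> test.
# Ground truth is y = 2x; the "model" is y = w*x with a single learnable w.
train_data = [(x, 2.0 * x) for x in range(1, 9)]   # data processing
test_data = [(9, 18.0), (10, 20.0)]

w = 0.0                                            # build the model: y = w*x
lr = 0.01
for _ in range(200):                               # train: gradient descent on MSE
    grad = sum(2 * (w * x - y) * x for x, y in train_data) / len(train_data)
    w -= lr * grad

test_mse = sum((w * x - y) ** 2 for x, y in test_data) / len(test_data)  # test
print(round(w, 3), round(test_mse, 6))
```

The same four stages reappear below, just with paddle doing the heavy lifting.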




Neurons

Artificial neurons take their inspiration from the human nervous system.

A biological nerve cell consists roughly of dendrites, synapses, a cell body, and an axon.

A single nerve cell can be viewed as a machine with only two states: "yes" when it fires and "no" when it is at rest.

This maps neatly onto a computer's 0 and 1, and stacking large numbers of such neurons gives us a neural network.
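That two-state picture can be written out in a few lines of plain Python: a unit that fires (outputs 1) when the weighted sum of its inputs crosses zero and stays silent (outputs 0) otherwise. The weights and bias below are made up purely for illustration:

```python
def neuron(inputs, weights, bias):
    """A binary threshold neuron: fires (1) iff the weighted input sum exceeds 0."""
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 if total > 0 else 0

# With these illustrative weights the unit behaves like a logical AND:
print(neuron([1, 1], [0.5, 0.5], -0.7))  # both inputs on -> fires
print(neuron([1, 0], [0.5, 0.5], -0.7))  # one input on   -> silent
```

Real network layers replace the hard threshold with smooth activations such as ReLU, which is what the paddle code below uses.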




Network Structure

Layers come in several types, such as linear (fully connected) and convolutional layers.

Within a network, the output of one layer must match the input of the next layer.

In general we only have to fix the input layer and the output layer; everything in between is collectively called the hidden layers.
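That input/output matching rule can be checked mechanically. Here is a small, purely illustrative helper that validates a chain of (in_features, out_features) pairs like the ones used by the Linear layers below:

```python
def shapes_compatible(layers):
    """layers: list of (in_features, out_features) pairs. Returns True iff each
    layer's output size equals the next layer's input size."""
    return all(out == nxt_in for (_, out), (nxt_in, _) in zip(layers, layers[1:]))

print(shapes_compatible([(784, 100), (100, 100), (100, 10)]))  # sizes chain correctly
print(shapes_compatible([(784, 100), (50, 10)]))               # mismatch: 100 != 50
```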


Next, let's build a few common networks, using an input of shape (1, 28, 28) and an output of size 10 as the example:

import paddle
import paddle.nn as nn
from paddle.nn import Linear
import paddle.nn.functional as F
from paddle.vision.transforms import ToTensor


# Single-layer linear network
model = nn.Linear(in_features=1*28*28, out_features=10)  # define the linear layer
paddle.summary(model, (1*28*28,))  # summary expects a shape tuple, hence the trailing comma
---------------------------------------------------------------------------
 Layer (type)       Input Shape          Output Shape         Param #    
===========================================================================
   Linear-3           [[784]]                [10]              7,850     
===========================================================================
Total params: 7,850
Trainable params: 7,850
Non-trainable params: 0
---------------------------------------------------------------------------
Input size (MB): 0.00
Forward/backward pass size (MB): 0.00
Params size (MB): 0.03
Estimated Total Size (MB): 0.03
---------------------------------------------------------------------------
{'total_params': 7850, 'trainable_params': 7850}
# DNN (multi-layer perceptron)
class MyDNN(paddle.nn.Layer):
    def __init__(self):
        super(MyDNN, self).__init__()
        self.hidden1 = Linear(28, 100)
        self.hidden2 = Linear(100, 100)
        self.hidden3 = Linear(100, 28)
        self.hidden4 = Linear(1*28*28, 10)

    def forward(self, input):
        x = F.relu(self.hidden1(input))
        x = F.relu(self.hidden2(x))
        x = F.relu(self.hidden3(x))
        x = paddle.reshape(x, shape=[-1, 1*28*28])  # flatten before the classifier layer
        x = self.hidden4(x)
        y = F.softmax(x)  # class probabilities
        return y

network = MyDNN()
paddle.summary(network, (1, 28, 28))
---------------------------------------------------------------------------
 Layer (type)       Input Shape          Output Shape         Param #    
===========================================================================
   Linear-8        [[1, 28, 28]]         [1, 28, 100]          2,900     
   Linear-9        [[1, 28, 100]]        [1, 28, 100]         10,100     
   Linear-10       [[1, 28, 100]]        [1, 28, 28]           2,828     
   Linear-11         [[1, 784]]            [1, 10]             7,850     
===========================================================================
Total params: 23,678
Trainable params: 23,678
Non-trainable params: 0
---------------------------------------------------------------------------
Input size (MB): 0.00
Forward/backward pass size (MB): 0.05
Params size (MB): 0.09
Estimated Total Size (MB): 0.14
---------------------------------------------------------------------------
{'total_params': 23678, 'trainable_params': 23678}


Convolutional Neural Networks (CNNs)


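A useful sanity check for the CNN defined below is the standard convolution/pooling output-size formula, out = (in + 2*padding - kernel) // stride + 1. The snippet traces a 28x28 input through the layers used in the network that follows:

```python
def out_size(n, kernel, stride=1, padding=0):
    """Spatial output size of a square conv or pooling layer."""
    return (n + 2 * padding - kernel) // stride + 1

n = 28
n = out_size(n, kernel=3, stride=1, padding=1)  # conv1: 28 -> 28
n = out_size(n, kernel=2, stride=2)             # pool1: 28 -> 14
n = out_size(n, kernel=5, stride=1, padding=0)  # conv2: 14 -> 10
n = out_size(n, kernel=2, stride=2)             # pool2: 10 -> 5
print(n, 16 * n * n)  # 5 400 -> matches in_features=400 of the first Linear layer
```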

network = nn.Sequential(
    nn.Conv2D(in_channels=1, out_channels=6, kernel_size=3, stride=1, padding=1),  # convolution
    nn.ReLU(),  # activation
    nn.MaxPool2D(kernel_size=2, stride=2),  # max pooling
    nn.Conv2D(in_channels=6, out_channels=16, kernel_size=5, stride=1, padding=0),
    nn.ReLU(),
    nn.MaxPool2D(kernel_size=2, stride=2),
    nn.Flatten(),
    nn.Linear(in_features=400, out_features=120),  # 400 = 16*5*5 after the second pooling layer (28x28 input with the padding above)
    nn.Linear(in_features=120, out_features=84),
    nn.Linear(in_features=84, out_features=10)
)
paddle.summary(network, (1, 1, 28, 28))
---------------------------------------------------------------------------
 Layer (type)       Input Shape          Output Shape         Param #    
===========================================================================
   Conv2D-12      [[1, 1, 28, 28]]      [1, 6, 28, 28]          60       
    ReLU-3        [[1, 6, 28, 28]]      [1, 6, 28, 28]           0       
  MaxPool2D-3     [[1, 6, 28, 28]]      [1, 6, 14, 14]           0       
   Conv2D-13      [[1, 6, 14, 14]]     [1, 16, 10, 10]         2,416     
    ReLU-4       [[1, 16, 10, 10]]     [1, 16, 10, 10]           0       
  MaxPool2D-4    [[1, 16, 10, 10]]      [1, 16, 5, 5]            0       
   Flatten-5      [[1, 16, 5, 5]]          [1, 400]              0       
   Linear-21         [[1, 400]]            [1, 120]           48,120     
   Linear-22         [[1, 120]]            [1, 84]            10,164     
   Linear-23         [[1, 84]]             [1, 10]              850      
===========================================================================
Total params: 61,610
Trainable params: 61,610
Non-trainable params: 0
---------------------------------------------------------------------------
Input size (MB): 0.00
Forward/backward pass size (MB): 0.11
Params size (MB): 0.24
Estimated Total Size (MB): 0.35
---------------------------------------------------------------------------
{'total_params': 61610, 'trainable_params': 61610}
# A ready-made network from the paddle.vision model zoo
network = paddle.vision.models.resnet101(num_classes=10)
paddle.summary(network, (1, 3, 224, 224))
-------------------------------------------------------------------------------
   Layer (type)         Input Shape          Output Shape         Param #    
===============================================================================
    Conv2D-123       [[1, 3, 224, 224]]   [1, 64, 112, 112]        9,408     
  BatchNorm2D-105   [[1, 64, 112, 112]]   [1, 64, 112, 112]         256      
      ReLU-43       [[1, 64, 112, 112]]   [1, 64, 112, 112]          0       
    MaxPool2D-8     [[1, 64, 112, 112]]    [1, 64, 56, 56]           0       
    Conv2D-125       [[1, 64, 56, 56]]     [1, 64, 56, 56]         4,096     
  BatchNorm2D-107    [[1, 64, 56, 56]]     [1, 64, 56, 56]          256      
      ReLU-44        [[1, 256, 56, 56]]    [1, 256, 56, 56]          0       
    Conv2D-126       [[1, 64, 56, 56]]     [1, 64, 56, 56]        36,864     
  BatchNorm2D-108    [[1, 64, 56, 56]]     [1, 64, 56, 56]          256      
    Conv2D-127       [[1, 64, 56, 56]]     [1, 256, 56, 56]       16,384     
  BatchNorm2D-109    [[1, 256, 56, 56]]    [1, 256, 56, 56]        1,024     
    Conv2D-124       [[1, 64, 56, 56]]     [1, 256, 56, 56]       16,384     
  BatchNorm2D-106    [[1, 256, 56, 56]]    [1, 256, 56, 56]        1,024     
BottleneckBlock-34   [[1, 64, 56, 56]]     [1, 256, 56, 56]          0       
    Conv2D-128       [[1, 256, 56, 56]]    [1, 64, 56, 56]        16,384     
  BatchNorm2D-110    [[1, 64, 56, 56]]     [1, 64, 56, 56]          256      
      ReLU-45        [[1, 256, 56, 56]]    [1, 256, 56, 56]          0       
    Conv2D-129       [[1, 64, 56, 56]]     [1, 64, 56, 56]        36,864     
  BatchNorm2D-111    [[1, 64, 56, 56]]     [1, 64, 56, 56]          256      
    Conv2D-130       [[1, 64, 56, 56]]     [1, 256, 56, 56]       16,384     
  BatchNorm2D-112    [[1, 256, 56, 56]]    [1, 256, 56, 56]        1,024     
BottleneckBlock-35   [[1, 256, 56, 56]]    [1, 256, 56, 56]          0       
    Conv2D-131       [[1, 256, 56, 56]]    [1, 64, 56, 56]        16,384     
  BatchNorm2D-113    [[1, 64, 56, 56]]     [1, 64, 56, 56]          256      
      ReLU-46        [[1, 256, 56, 56]]    [1, 256, 56, 56]          0       
    Conv2D-132       [[1, 64, 56, 56]]     [1, 64, 56, 56]        36,864     
  BatchNorm2D-114    [[1, 64, 56, 56]]     [1, 64, 56, 56]          256      
    Conv2D-133       [[1, 64, 56, 56]]     [1, 256, 56, 56]       16,384     
  BatchNorm2D-115    [[1, 256, 56, 56]]    [1, 256, 56, 56]        1,024     
BottleneckBlock-36   [[1, 256, 56, 56]]    [1, 256, 56, 56]          0       
    Conv2D-135       [[1, 256, 56, 56]]    [1, 128, 56, 56]       32,768     
  BatchNorm2D-117    [[1, 128, 56, 56]]    [1, 128, 56, 56]         512      
      ReLU-47        [[1, 512, 28, 28]]    [1, 512, 28, 28]          0       
    Conv2D-136       [[1, 128, 56, 56]]    [1, 128, 28, 28]       147,456    
  BatchNorm2D-118    [[1, 128, 28, 28]]    [1, 128, 28, 28]         512      
    Conv2D-137       [[1, 128, 28, 28]]    [1, 512, 28, 28]       65,536     
  BatchNorm2D-119    [[1, 512, 28, 28]]    [1, 512, 28, 28]        2,048     
    Conv2D-134       [[1, 256, 56, 56]]    [1, 512, 28, 28]       131,072    
  BatchNorm2D-116    [[1, 512, 28, 28]]    [1, 512, 28, 28]        2,048     
BottleneckBlock-37   [[1, 256, 56, 56]]    [1, 512, 28, 28]          0       
    Conv2D-138       [[1, 512, 28, 28]]    [1, 128, 28, 28]       65,536     
  BatchNorm2D-120    [[1, 128, 28, 28]]    [1, 128, 28, 28]         512      
      ReLU-48        [[1, 512, 28, 28]]    [1, 512, 28, 28]          0       
    Conv2D-139       [[1, 128, 28, 28]]    [1, 128, 28, 28]       147,456    
  BatchNorm2D-121    [[1, 128, 28, 28]]    [1, 128, 28, 28]         512      
    Conv2D-140       [[1, 128, 28, 28]]    [1, 512, 28, 28]       65,536     
  BatchNorm2D-122    [[1, 512, 28, 28]]    [1, 512, 28, 28]        2,048     
BottleneckBlock-38   [[1, 512, 28, 28]]    [1, 512, 28, 28]          0       
    Conv2D-141       [[1, 512, 28, 28]]    [1, 128, 28, 28]       65,536     
  BatchNorm2D-123    [[1, 128, 28, 28]]    [1, 128, 28, 28]         512      
      ReLU-49        [[1, 512, 28, 28]]    [1, 512, 28, 28]          0       
    Conv2D-142       [[1, 128, 28, 28]]    [1, 128, 28, 28]       147,456    
  BatchNorm2D-124    [[1, 128, 28, 28]]    [1, 128, 28, 28]         512      
    Conv2D-143       [[1, 128, 28, 28]]    [1, 512, 28, 28]       65,536     
  BatchNorm2D-125    [[1, 512, 28, 28]]    [1, 512, 28, 28]        2,048     
BottleneckBlock-39   [[1, 512, 28, 28]]    [1, 512, 28, 28]          0       
    Conv2D-144       [[1, 512, 28, 28]]    [1, 128, 28, 28]       65,536     
  BatchNorm2D-126    [[1, 128, 28, 28]]    [1, 128, 28, 28]         512      
      ReLU-50        [[1, 512, 28, 28]]    [1, 512, 28, 28]          0       
    Conv2D-145       [[1, 128, 28, 28]]    [1, 128, 28, 28]       147,456    
  BatchNorm2D-127    [[1, 128, 28, 28]]    [1, 128, 28, 28]         512      
    Conv2D-146       [[1, 128, 28, 28]]    [1, 512, 28, 28]       65,536     
  BatchNorm2D-128    [[1, 512, 28, 28]]    [1, 512, 28, 28]        2,048     
BottleneckBlock-40   [[1, 512, 28, 28]]    [1, 512, 28, 28]          0       
    Conv2D-148       [[1, 512, 28, 28]]    [1, 256, 28, 28]       131,072    
  BatchNorm2D-130    [[1, 256, 28, 28]]    [1, 256, 28, 28]        1,024     
      ReLU-51       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-149       [[1, 256, 28, 28]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-131    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-150       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-132   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
    Conv2D-147       [[1, 512, 28, 28]]   [1, 1024, 14, 14]       524,288    
  BatchNorm2D-129   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-41   [[1, 512, 28, 28]]   [1, 1024, 14, 14]          0       
    Conv2D-151      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-133    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-52       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-152       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-134    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-153       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-135   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-42  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-154      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-136    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-53       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-155       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-137    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-156       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-138   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-43  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-157      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-139    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-54       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-158       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-140    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-159       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-141   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-44  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-160      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-142    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-55       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-161       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-143    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-162       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-144   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-45  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-163      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-145    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-56       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-164       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-146    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-165       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-147   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-46  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-166      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-148    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-57       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-167       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-149    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-168       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-150   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-47  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-169      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-151    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-58       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-170       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-152    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-171       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-153   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-48  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-172      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-154    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-59       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-173       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-155    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-174       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-156   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-49  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-175      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-157    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-60       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-176       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-158    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-177       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-159   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-50  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-178      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-160    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-61       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-179       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-161    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-180       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-162   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-51  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-181      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-163    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-62       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-182       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-164    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-183       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-165   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-52  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-184      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-166    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-63       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-185       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-167    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-186       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-168   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-53  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-187      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-169    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-64       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-188       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-170    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-189       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-171   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-54  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-190      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-172    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-65       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-191       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-173    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-192       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-174   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-55  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-193      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-175    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-66       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-194       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-176    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-195       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-177   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-56  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-196      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-178    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-67       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-197       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-179    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-198       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-180   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-57  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-199      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-181    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-68       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-200       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-182    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-201       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-183   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-58  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-202      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-184    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-69       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-203       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-185    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-204       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-186   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-59  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-205      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-187    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-70       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-206       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-188    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-207       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-189   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-60  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-208      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-190    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-71       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-209       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-191    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-210       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-192   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-61  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-211      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-193    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-72       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-212       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-194    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-213       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-195   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-62  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-214      [[1, 1024, 14, 14]]    [1, 256, 14, 14]       262,144    
  BatchNorm2D-196    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
      ReLU-73       [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-215       [[1, 256, 14, 14]]    [1, 256, 14, 14]       589,824    
  BatchNorm2D-197    [[1, 256, 14, 14]]    [1, 256, 14, 14]        1,024     
    Conv2D-216       [[1, 256, 14, 14]]   [1, 1024, 14, 14]       262,144    
  BatchNorm2D-198   [[1, 1024, 14, 14]]   [1, 1024, 14, 14]        4,096     
BottleneckBlock-63  [[1, 1024, 14, 14]]   [1, 1024, 14, 14]          0       
    Conv2D-218      [[1, 1024, 14, 14]]    [1, 512, 14, 14]       524,288    
  BatchNorm2D-200    [[1, 512, 14, 14]]    [1, 512, 14, 14]        2,048     
      ReLU-74        [[1, 2048, 7, 7]]     [1, 2048, 7, 7]           0       
    Conv2D-219       [[1, 512, 14, 14]]     [1, 512, 7, 7]       2,359,296   
  BatchNorm2D-201     [[1, 512, 7, 7]]      [1, 512, 7, 7]         2,048     
    Conv2D-220        [[1, 512, 7, 7]]     [1, 2048, 7, 7]       1,048,576   
  BatchNorm2D-202    [[1, 2048, 7, 7]]     [1, 2048, 7, 7]         8,192     
    Conv2D-217      [[1, 1024, 14, 14]]    [1, 2048, 7, 7]       2,097,152   
  BatchNorm2D-199    [[1, 2048, 7, 7]]     [1, 2048, 7, 7]         8,192     
BottleneckBlock-64  [[1, 1024, 14, 14]]    [1, 2048, 7, 7]           0       
    Conv2D-221       [[1, 2048, 7, 7]]      [1, 512, 7, 7]       1,048,576   
  BatchNorm2D-203     [[1, 512, 7, 7]]      [1, 512, 7, 7]         2,048     
      ReLU-75        [[1, 2048, 7, 7]]     [1, 2048, 7, 7]           0       
    Conv2D-222        [[1, 512, 7, 7]]      [1, 512, 7, 7]       2,359,296   
  BatchNorm2D-204     [[1, 512, 7, 7]]      [1, 512, 7, 7]         2,048     
    Conv2D-223        [[1, 512, 7, 7]]     [1, 2048, 7, 7]       1,048,576   
  BatchNorm2D-205    [[1, 2048, 7, 7]]     [1, 2048, 7, 7]         8,192     
BottleneckBlock-65   [[1, 2048, 7, 7]]     [1, 2048, 7, 7]           0       
    Conv2D-224       [[1, 2048, 7, 7]]      [1, 512, 7, 7]       1,048,576   
  BatchNorm2D-206     [[1, 512, 7, 7]]      [1, 512, 7, 7]         2,048     
      ReLU-76        [[1, 2048, 7, 7]]     [1, 2048, 7, 7]           0       
    Conv2D-225        [[1, 512, 7, 7]]      [1, 512, 7, 7]       2,359,296   
  BatchNorm2D-207     [[1, 512, 7, 7]]      [1, 512, 7, 7]         2,048     
    Conv2D-226        [[1, 512, 7, 7]]     [1, 2048, 7, 7]       1,048,576   
  BatchNorm2D-208    [[1, 2048, 7, 7]]     [1, 2048, 7, 7]         8,192     
BottleneckBlock-66   [[1, 2048, 7, 7]]     [1, 2048, 7, 7]           0       
AdaptiveAvgPool2D-3  [[1, 2048, 7, 7]]     [1, 2048, 1, 1]           0       
     Linear-33          [[1, 2048]]            [1, 10]            20,490     
===============================================================================
Total params: 42,625,994
Trainable params: 42,415,306
Non-trainable params: 210,688
-------------------------------------------------------------------------------
Input size (MB): 0.57
Forward/backward pass size (MB): 391.63
Params size (MB): 162.61
Estimated Total Size (MB): 554.81
-------------------------------------------------------------------------------
{'total_params': 42625994, 'trainable_params': 42415306}


Gradient Descent

To measure how well, and how efficiently, a model is learning, we introduce the concept of the gradient.
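Concretely, gradient descent repeatedly moves a parameter a small step against the gradient of the loss. As a minimal illustration, independent of paddle, here is the descent on f(w) = (w - 3)^2, whose minimum is at w = 3:

```python
def f_grad(w):
    """Gradient of f(w) = (w - 3)**2."""
    return 2 * (w - 3)

w = 0.0
learning_rate = 0.1
for step in range(100):
    w -= learning_rate * f_grad(w)  # step against the gradient

print(round(w, 4))  # converges to the minimum at w = 3
```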




# Optimizer based on stochastic gradient descent (SGD)
sgd_optimizer = paddle.optimizer.SGD(learning_rate=0.001, parameters=model.parameters())
# Loss computation
mse_loss = paddle.nn.MSELoss()
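What SGD with an MSE loss does on each step can be written out by hand. The pure-Python sketch below updates a tiny linear model y = w*x + b with the same per-sample update rule and the same learning_rate of 0.001 as the paddle snippet above; the data and initial values are made up for illustration:

```python
# One-feature linear model trained with plain SGD on an MSE loss.
data = [(1.0, 3.0), (2.0, 5.0), (3.0, 7.0)]  # samples of y = 2x + 1
w, b = 0.0, 0.0
learning_rate = 0.001

for _ in range(20000):
    for x, y in data:                   # "stochastic": one sample at a time
        pred = w * x + b
        grad_w = 2 * (pred - y) * x     # d(MSE)/dw for a single sample
        grad_b = 2 * (pred - y)         # d(MSE)/db for a single sample
        w -= learning_rate * grad_w     # the SGD update rule
        b -= learning_rate * grad_b

print(round(w, 2), round(b, 2))  # approaches w = 2, b = 1
```

paddle's optimizer does exactly this parameter update, with the gradients supplied by autograd instead of by hand.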


Training


# `model` here is the high-level wrapper, e.g. model = paddle.Model(network)
model.prepare(paddle.optimizer.Adam(learning_rate=0.001, parameters=model.parameters()),
              paddle.nn.CrossEntropyLoss(),   # cross-entropy loss; a linear model plus this loss is a softmax classifier
              paddle.metric.Accuracy(topk=(1, 2)))
model.fit(train_dataset,  # training dataset
          val_dataset,    # evaluation dataset
          epochs=2,       # total number of training epochs
          batch_size=64,  # batch size used for training
          verbose=1)      # logging verbosity
model.evaluate(test_dataset, batch_size=64, verbose=1)  # evaluation


From the legendary writer of the worst code in the Paddle community: let's keep improving together!

Remember: whatever 三岁 (Sansui) produces is a masterpiece (shameless series)
