GhostNet Architecture Reproduction -- CVPR 2020

Introduction: Deploying convolutional neural networks (CNNs) on embedded devices is difficult because memory and compute resources are limited. Redundancy in feature maps is an important characteristic of successful CNNs, but it is rarely investigated in neural architecture design. **This paper proposes a novel Ghost module that generates more feature maps from cheap operations. Based on a set of intrinsic feature maps, a series of cheap linear transformations is applied to produce many ghost feature maps that fully reveal the information underlying the intrinsic features. The Ghost module can be used as a plug-and-play component to upgrade existing convolutional neural networks. Ghost bottlenecks are designed to stack Ghost modules, from which the lightweight GhostNet can be easily built.**
Reference paper: GhostNet: More Features from Cheap Operations

Authors: Kai Han, Yunhe Wang, Qi Tian, Jianyuan Guo, Chunjing Xu, Chang Xu

1. Paper Abstract

  Deploying convolutional neural networks (CNNs) on embedded devices is difficult because memory and compute resources are limited. Redundancy in feature maps is an important characteristic of successful CNNs, but it is rarely investigated in neural architecture design. This paper proposes a novel Ghost module that generates more feature maps from cheap operations. Based on a set of intrinsic feature maps, a series of cheap linear transformations is applied to produce many ghost feature maps that fully reveal the information underlying the intrinsic features. The Ghost module can be used as a plug-and-play component to upgrade existing convolutional neural networks. Ghost bottlenecks are designed to stack Ghost modules, from which the lightweight GhostNet can be easily built.

  Experiments on benchmarks show that the proposed Ghost module is an impressive alternative to the convolution layers of baseline models, and that GhostNet achieves higher recognition performance than MobileNetV3 (e.g. 75.7% top-1 accuracy) at similar computational cost on the ImageNet ILSVRC-2012 classification dataset. Code: https://github.com/huawei-noah/ghostnet

2、Ghost Module

  Reference: the blog post “神经网络学习小记录58——Keras GhostNet模型的复现详解” (a detailed walkthrough of reproducing GhostNet in Keras).


   Figure 2. Illustration of a standard convolutional layer and the proposed Ghost module producing the same number of feature maps. Φ denotes a cheap operation.

  The Ghost module splits an ordinary convolution into two parts. First comes an ordinary 1x1 convolution with a reduced number of filters; for example, where a normal layer would use 32 channels, only 16 are used here. This 1x1 convolution acts as feature aggregation and produces a condensed (intrinsic) set of feature maps from the input layer.

  A depthwise convolution is then applied. This per-channel convolution takes the condensed feature maps from the previous step and generates the Ghost feature maps from them.

  Ghost module summary (a worked parameter-count example follows the list):

  • Use a 1x1 convolution to obtain a condensed set of essential features from the input.
  • Use a depthwise convolution on that condensed set to obtain the similar ("ghost") feature maps.
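
  To make the saving concrete, here is a minimal parameter-count sketch for this reproduction's variant of the module (a 1x1 primary convolution plus a 3x3 depthwise convolution, ratio = 2). The numbers are computed directly from the layer definitions, not taken from the paper:

c_in, c_out, k = 16, 32, 3

# ordinary 3x3 convolution: every output channel sees every input channel
plain_conv = c_in * c_out * k * k                       # 16*32*9 = 4608 weights

# Ghost module with ratio=2: half of the output channels come from a 1x1 conv
# ("intrinsic" maps), the other half from a 3x3 depthwise conv on those maps
intrinsic = c_out // 2                                  # 16 intrinsic feature maps
ghost = c_in * intrinsic + intrinsic * k * k            # 256 + 144 = 400 weights

print(plain_conv, ghost, round(plain_conv / ghost, 1))  # 4608 400 11.5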

3. Ghost Bottleneck


   Figure 3. Ghost bottleneck. Left: Ghost bottleneck with stride=1; right: Ghost bottleneck with stride=2.

  The proposed Ghost bottleneck consists mainly of two stacked Ghost modules.

  The first Ghost module acts as an expansion layer that increases the number of channels; the ratio of its output channels to the input channels is called the expansion ratio. The second Ghost module reduces the number of channels to match the shortcut path. A residual connection is then applied between the input and the output of these two Ghost modules.

  Batch normalization (BN) [25] and the ReLU nonlinearity are applied after each layer, except that, following MobileNetV2 [44], no ReLU is used after the second Ghost module.

  When stride=2, the shortcut path gets a depthwise convolution with stride 2 followed by an ordinary 1x1 convolution, as in the shape check below.
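
  A minimal shape check of that shortcut path (illustrative, using channel counts from Table 1: 24 channels at 56x56 downsampled to 40 channels at 28x28):

import tensorflow as tf
from tensorflow.keras import layers

x = tf.zeros((1, 56, 56, 24))                                      # example stage input

# shortcut path for stride=2: stride-2 depthwise conv + 1x1 projection
res = layers.DepthwiseConv2D(3, strides=2, padding='same', use_bias=False)(x)
res = layers.Conv2D(40, 1, use_bias=False)(res)                    # project 24 -> 40 channels

print(res.shape)                                                   # (1, 28, 28, 40)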

4. GhostNet Network Architecture


  Table 1. Overall architecture of GhostNet. G-bneck denotes a Ghost bottleneck. #exp denotes the expansion size. #out denotes the number of output channels. SE indicates whether an SE module is used.

  The Ghost bottleneck described above applies to stride=1. For stride=2, the shortcut path is implemented with a downsampling layer, and a depthwise convolution with stride=2 is inserted between the two Ghost modules. In practice, the primary convolution in the Ghost module is a pointwise (1x1) convolution for efficiency.

  The first layer is a standard convolution with 16 filters, followed by a series of Ghost bottlenecks with gradually increasing channel counts. These Ghost bottlenecks are grouped into stages according to the size of their input feature maps. All Ghost bottlenecks use stride=1 except the last one in each stage, which uses stride=2.

  Finally, global average pooling and a convolutional layer transform the feature map into a 1280-dimensional feature vector for the final classification.

  The squeeze-and-excitation (SE) module [22] is also applied to the residual layer in some Ghost bottlenecks.

5. TensorFlow Code Reproduction

import tensorflow as tf
import math
from tensorflow.keras import layers
from tensorflow.keras.models import Model

5.1 SE Attention Module

def se_block(input_feature, ratio=4, name=None):
    # number of input channels
    channel = input_feature.shape[-1]
    se_filters = int(channel // ratio)

    x = layers.GlobalAveragePooling2D()(input_feature)

    x = layers.Reshape((1, 1, channel))(x)

    # two FC layers, implemented here as two 1x1 convolutions
    x = layers.Conv2D(filters=se_filters,
                      kernel_size=(1, 1),
                      activation='relu',
                      kernel_initializer='he_normal',
                      use_bias=False)(x)
    x = layers.Conv2D(filters=channel,
                      kernel_size=(1, 1),
                      kernel_initializer='he_normal',
                      use_bias=False)(x)
    # x = layers.Activation('sigmoid')(x)
    x = layers.Activation('hard_sigmoid')(x)
    # scale the original input feature maps by the learned channel weights
    out = layers.multiply([input_feature, x])
    return out
  Following the reference blog post and the official source code, the activation used here is hard_sigmoid rather than sigmoid.
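
  A quick sanity check (illustrative; it assumes the imports and the se_block defined above): the SE block only rescales channels, so the output shape matches the input.

feat = layers.Input(shape=(28, 28, 72))
print(se_block(feat).shape)   # (None, 28, 28, 72)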

5.2 Ghost Module

def GhostModule(inputs, exp, ratio, stride=1, relu=True):
    # ratio is usually 2: x and dw are concatenated at the end,
    # which keeps the number of output channels equal to exp
    output_channels = math.ceil(exp * 1.0 / ratio)
    # 1x1 conv
    x = layers.Conv2D(filters=output_channels,
                      kernel_size=(1, 1),
                      strides=stride,
                      padding='same',
                      use_bias=False)(inputs)
    x = layers.BatchNormalization()(x)
    if relu:
        x = layers.ReLU()(x)
    # depthwise convolution
    dw = layers.DepthwiseConv2D(kernel_size=(3, 3),
                                strides=stride,
                                padding='same',
                                use_bias=False)(x)
    dw = layers.BatchNormalization()(dw)
    if relu:
        dw = layers.ReLU()(dw)
    # concatenate along the channel dimension
    x = layers.concatenate([x, dw], axis=-1)
    return x
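
  A quick sanity check (illustrative, assuming the definitions above): with ratio=2, concatenating the 1x1-conv branch and the depthwise branch yields exactly exp output channels.

feat = layers.Input(shape=(56, 56, 16))
print(GhostModule(feat, exp=48, ratio=2).shape)   # (None, 56, 56, 48)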

5.3 Ghost Bottleneck

def GhostBottleNeck(inputs,          # input tensor
                    output_channel,  # number of output channels
                    kernel,          # kernel size of the depthwise conv
                    strides,         # stride
                    exp_channel,     # expansion size
                    ratio,           # channel reduction of the first 1x1 conv in GhostModule, usually 2
                    se):             # whether to use the SE attention module
    x = GhostModule(inputs, exp=exp_channel, ratio=ratio, relu=True)

    # when stride=2, insert a stride-2 depthwise conv between the two Ghost modules
    if strides > 1:
        x = layers.DepthwiseConv2D(kernel_size=kernel,
                                   strides=strides,
                                   padding='same',
                                   use_bias=False)(x)
        x = layers.BatchNormalization()(x)

    # optional SE attention module
    if se:
        x = se_block(x)
    # second Ghost module
    x = GhostModule(x, exp=output_channel, ratio=ratio, relu=False)

    # when stride=1 and the input/output feature maps have the same shape, use an identity shortcut
    if strides == 1 and inputs.shape[-1] == x.shape[-1]:
        res = inputs
    # when downsampling with stride=2, apply a stride-2 depthwise conv and a 1x1 conv on the shortcut path
    else:
        # stride-2 depthwise conv (kernel size `kernel`)
        res = layers.DepthwiseConv2D(kernel_size=kernel,
                                     strides=strides,
                                     padding='same',
                                     use_bias=False)(inputs)
        res = layers.BatchNormalization()(res)
        # 1x1 conv
        res = layers.Conv2D(filters=output_channel,
                            kernel_size=(1, 1),
                            strides=1,
                            padding='same',
                            use_bias=False)(res)
        res = layers.BatchNormalization()(res)
    x = layers.Add()([x, res])
    return x
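
  Two quick sanity checks (illustrative, with channel numbers taken from Table 1): a stride-1 bottleneck keeps the feature-map shape and uses the identity shortcut, while a stride-2 bottleneck halves the resolution and projects the shortcut to output_channel.

feat = layers.Input(shape=(56, 56, 24))
print(GhostBottleNeck(feat, output_channel=24, kernel=(3, 3), strides=1,
                      exp_channel=72, ratio=2, se=False).shape)    # (None, 56, 56, 24)
print(GhostBottleNeck(feat, output_channel=40, kernel=(5, 5), strides=2,
                      exp_channel=72, ratio=2, se=True).shape)     # (None, 28, 28, 40)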

5.4 Building GhostNet

  The bottleneck configuration below follows the official source code.


  The coding style of the official source is hard to read, so I also referred to several other reproductions.

def GhostNet(input_shape=(224, 224, 3), classes=1000, ratio=2):
    inputs = layers.Input(shape=input_shape)
    # first standard convolution
    x = layers.Conv2D(filters=16, kernel_size=(3, 3), strides=2, padding='same',
                      use_bias=False)(inputs)
    x = layers.BatchNormalization()(x)
    x = layers.ReLU()(x)

    x = GhostBottleNeck(x, output_channel=16, kernel=(3, 3), strides=1, exp_channel=16, ratio=ratio, se=False)

    x = GhostBottleNeck(x, output_channel=24, kernel=(3, 3), strides=2, exp_channel=48, ratio=ratio, se=False)

    x = GhostBottleNeck(x, output_channel=24, kernel=(3, 3), strides=1, exp_channel=72, ratio=ratio, se=False)
    x = GhostBottleNeck(x, output_channel=40, kernel=(5, 5), strides=2, exp_channel=72, ratio=ratio, se=True)

    x = GhostBottleNeck(x, output_channel=40, kernel=(5, 5), strides=1, exp_channel=120, ratio=ratio, se=True)
    x = GhostBottleNeck(x, output_channel=80, kernel=(3, 3), strides=2, exp_channel=240, ratio=ratio, se=False)

    x = GhostBottleNeck(x, output_channel=80, kernel=(3, 3), strides=1, exp_channel=200, ratio=ratio, se=False)
    x = GhostBottleNeck(x, output_channel=80, kernel=(3, 3), strides=1, exp_channel=184, ratio=ratio, se=False)
    x = GhostBottleNeck(x, output_channel=80, kernel=(3, 3), strides=1, exp_channel=184, ratio=ratio, se=False)
    x = GhostBottleNeck(x, output_channel=112, kernel=(3, 3), strides=1, exp_channel=480, ratio=ratio, se=True)
    x = GhostBottleNeck(x, output_channel=112, kernel=(3, 3), strides=1, exp_channel=672, ratio=ratio, se=True)
    x = GhostBottleNeck(x, output_channel=160, kernel=(5, 5), strides=2, exp_channel=672, ratio=ratio, se=True)

    x = GhostBottleNeck(x, output_channel=160, kernel=(5, 5), strides=1, exp_channel=960, ratio=ratio, se=False)
    x = GhostBottleNeck(x, output_channel=160, kernel=(5, 5), strides=1, exp_channel=960, ratio=ratio, se=True)
    x = GhostBottleNeck(x, output_channel=160, kernel=(5, 5), strides=1, exp_channel=960, ratio=ratio, se=False)
    x = GhostBottleNeck(x, output_channel=160, kernel=(5, 5), strides=1, exp_channel=960, ratio=ratio, se=True)

    x = layers.Conv2D(filters=960, kernel_size=(1, 1), strides=1, padding='same', use_bias=False)(x)
    x = layers.BatchNormalization()(x)
    x = layers.ReLU()(x)

    # global average pooling
    x = layers.GlobalAveragePooling2D()(x)
    # reshape to (1, 1, C) because another Conv2D follows
    x = layers.Reshape((1, 1, x.shape[-1]))(x)

    x = layers.Conv2D(filters=1280, kernel_size=(1, 1), strides=1, padding='same', use_bias=False)(x)
    x = layers.BatchNormalization()(x)
    x = layers.ReLU()(x)

    x = layers.Flatten()(x)
    out = layers.Dense(classes, activation='softmax')(x)

    # build the model
    model = Model(inputs=inputs, outputs=out)

    return model
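
  A quick end-to-end check (illustrative): push a random image through the network and confirm that the output is a 1000-way softmax distribution.

net = GhostNet(input_shape=(224, 224, 3), classes=1000)
probs = net(tf.random.uniform((1, 224, 224, 3)))
print(probs.shape)                     # (1, 1000)
print(float(tf.reduce_sum(probs)))     # ~1.0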

5.5 Model Summary

if __name__ == '__main__':
    model = GhostNet(input_shape=(224, 224, 3), classes=1000)
    model.summary()
Model: "model"
__________________________________________________________________________________________________
Layer (type)                    Output Shape         Param #     Connected to                     
==================================================================================================
input_1 (InputLayer)            [(None, 224, 224, 3) 0                                            
__________________________________________________________________________________________________
conv2d (Conv2D)                 (None, 112, 112, 16) 432         input_1[0][0]                    
__________________________________________________________________________________________________
batch_normalization (BatchNorma (None, 112, 112, 16) 64          conv2d[0][0]                     
__________________________________________________________________________________________________
re_lu (ReLU)                    (None, 112, 112, 16) 0           batch_normalization[0][0]        
__________________________________________________________________________________________________
conv2d_1 (Conv2D)               (None, 112, 112, 8)  128         re_lu[0][0]                      
__________________________________________________________________________________________________
batch_normalization_1 (BatchNor (None, 112, 112, 8)  32          conv2d_1[0][0]                   
__________________________________________________________________________________________________
re_lu_1 (ReLU)                  (None, 112, 112, 8)  0           batch_normalization_1[0][0]      
__________________________________________________________________________________________________
depthwise_conv2d (DepthwiseConv (None, 112, 112, 8)  72          re_lu_1[0][0]                    
__________________________________________________________________________________________________
batch_normalization_2 (BatchNor (None, 112, 112, 8)  32          depthwise_conv2d[0][0]           
__________________________________________________________________________________________________
re_lu_2 (ReLU)                  (None, 112, 112, 8)  0           batch_normalization_2[0][0]      
__________________________________________________________________________________________________
concatenate (Concatenate)       (None, 112, 112, 16) 0           re_lu_1[0][0]                    
                                                                 re_lu_2[0][0]                    
__________________________________________________________________________________________________
conv2d_2 (Conv2D)               (None, 112, 112, 8)  128         concatenate[0][0]                
__________________________________________________________________________________________________
batch_normalization_3 (BatchNor (None, 112, 112, 8)  32          conv2d_2[0][0]                   
__________________________________________________________________________________________________
depthwise_conv2d_1 (DepthwiseCo (None, 112, 112, 8)  72          batch_normalization_3[0][0]      
__________________________________________________________________________________________________
batch_normalization_4 (BatchNor (None, 112, 112, 8)  32          depthwise_conv2d_1[0][0]         
__________________________________________________________________________________________________
concatenate_1 (Concatenate)     (None, 112, 112, 16) 0           batch_normalization_3[0][0]      
                                                                 batch_normalization_4[0][0]      
__________________________________________________________________________________________________
add (Add)                       (None, 112, 112, 16) 0           concatenate_1[0][0]              
                                                                 re_lu[0][0]                      
__________________________________________________________________________________________________
conv2d_3 (Conv2D)               (None, 112, 112, 24) 384         add[0][0]                        
__________________________________________________________________________________________________
batch_normalization_5 (BatchNor (None, 112, 112, 24) 96          conv2d_3[0][0]                   
__________________________________________________________________________________________________
re_lu_3 (ReLU)                  (None, 112, 112, 24) 0           batch_normalization_5[0][0]      
__________________________________________________________________________________________________
depthwise_conv2d_2 (DepthwiseCo (None, 112, 112, 24) 216         re_lu_3[0][0]                    
__________________________________________________________________________________________________
batch_normalization_6 (BatchNor (None, 112, 112, 24) 96          depthwise_conv2d_2[0][0]         
__________________________________________________________________________________________________
re_lu_4 (ReLU)                  (None, 112, 112, 24) 0           batch_normalization_6[0][0]      
__________________________________________________________________________________________________
concatenate_2 (Concatenate)     (None, 112, 112, 48) 0           re_lu_3[0][0]                    
                                                                 re_lu_4[0][0]                    
__________________________________________________________________________________________________
depthwise_conv2d_3 (DepthwiseCo (None, 56, 56, 48)   432         concatenate_2[0][0]              
__________________________________________________________________________________________________
batch_normalization_7 (BatchNor (None, 56, 56, 48)   192         depthwise_conv2d_3[0][0]         
__________________________________________________________________________________________________
conv2d_4 (Conv2D)               (None, 56, 56, 12)   576         batch_normalization_7[0][0]      
__________________________________________________________________________________________________
batch_normalization_8 (BatchNor (None, 56, 56, 12)   48          conv2d_4[0][0]                   
__________________________________________________________________________________________________
depthwise_conv2d_5 (DepthwiseCo (None, 56, 56, 16)   144         add[0][0]                        
__________________________________________________________________________________________________
depthwise_conv2d_4 (DepthwiseCo (None, 56, 56, 12)   108         batch_normalization_8[0][0]      
__________________________________________________________________________________________________
batch_normalization_10 (BatchNo (None, 56, 56, 16)   64          depthwise_conv2d_5[0][0]         
__________________________________________________________________________________________________
batch_normalization_9 (BatchNor (None, 56, 56, 12)   48          depthwise_conv2d_4[0][0]         
__________________________________________________________________________________________________
conv2d_5 (Conv2D)               (None, 56, 56, 24)   384         batch_normalization_10[0][0]     
__________________________________________________________________________________________________
concatenate_3 (Concatenate)     (None, 56, 56, 24)   0           batch_normalization_8[0][0]      
                                                                 batch_normalization_9[0][0]      
__________________________________________________________________________________________________
batch_normalization_11 (BatchNo (None, 56, 56, 24)   96          conv2d_5[0][0]                   
__________________________________________________________________________________________________
add_1 (Add)                     (None, 56, 56, 24)   0           concatenate_3[0][0]              
                                                                 batch_normalization_11[0][0]     
__________________________________________________________________________________________________
conv2d_6 (Conv2D)               (None, 56, 56, 36)   864         add_1[0][0]                      
__________________________________________________________________________________________________
batch_normalization_12 (BatchNo (None, 56, 56, 36)   144         conv2d_6[0][0]                   
__________________________________________________________________________________________________
re_lu_5 (ReLU)                  (None, 56, 56, 36)   0           batch_normalization_12[0][0]     
__________________________________________________________________________________________________
depthwise_conv2d_6 (DepthwiseCo (None, 56, 56, 36)   324         re_lu_5[0][0]                    
__________________________________________________________________________________________________
batch_normalization_13 (BatchNo (None, 56, 56, 36)   144         depthwise_conv2d_6[0][0]         
__________________________________________________________________________________________________
re_lu_6 (ReLU)                  (None, 56, 56, 36)   0           batch_normalization_13[0][0]     
__________________________________________________________________________________________________
concatenate_4 (Concatenate)     (None, 56, 56, 72)   0           re_lu_5[0][0]                    
                                                                 re_lu_6[0][0]                    
__________________________________________________________________________________________________
conv2d_7 (Conv2D)               (None, 56, 56, 12)   864         concatenate_4[0][0]              
__________________________________________________________________________________________________
batch_normalization_14 (BatchNo (None, 56, 56, 12)   48          conv2d_7[0][0]                   
__________________________________________________________________________________________________
depthwise_conv2d_7 (DepthwiseCo (None, 56, 56, 12)   108         batch_normalization_14[0][0]     
__________________________________________________________________________________________________
batch_normalization_15 (BatchNo (None, 56, 56, 12)   48          depthwise_conv2d_7[0][0]         
__________________________________________________________________________________________________
concatenate_5 (Concatenate)     (None, 56, 56, 24)   0           batch_normalization_14[0][0]     
                                                                 batch_normalization_15[0][0]     
__________________________________________________________________________________________________
add_2 (Add)                     (None, 56, 56, 24)   0           concatenate_5[0][0]              
                                                                 add_1[0][0]                      
__________________________________________________________________________________________________
conv2d_8 (Conv2D)               (None, 56, 56, 36)   864         add_2[0][0]                      
__________________________________________________________________________________________________
batch_normalization_16 (BatchNo (None, 56, 56, 36)   144         conv2d_8[0][0]                   
__________________________________________________________________________________________________
re_lu_7 (ReLU)                  (None, 56, 56, 36)   0           batch_normalization_16[0][0]     
__________________________________________________________________________________________________
depthwise_conv2d_8 (DepthwiseCo (None, 56, 56, 36)   324         re_lu_7[0][0]                    
__________________________________________________________________________________________________
batch_normalization_17 (BatchNo (None, 56, 56, 36)   144         depthwise_conv2d_8[0][0]         
__________________________________________________________________________________________________
re_lu_8 (ReLU)                  (None, 56, 56, 36)   0           batch_normalization_17[0][0]     
__________________________________________________________________________________________________
concatenate_6 (Concatenate)     (None, 56, 56, 72)   0           re_lu_7[0][0]                    
                                                                 re_lu_8[0][0]                    
__________________________________________________________________________________________________
depthwise_conv2d_9 (DepthwiseCo (None, 28, 28, 72)   1800        concatenate_6[0][0]              
__________________________________________________________________________________________________
batch_normalization_18 (BatchNo (None, 28, 28, 72)   288         depthwise_conv2d_9[0][0]         
__________________________________________________________________________________________________
global_average_pooling2d (Globa (None, 72)           0           batch_normalization_18[0][0]     
__________________________________________________________________________________________________
reshape (Reshape)               (None, 1, 1, 72)     0           global_average_pooling2d[0][0]   
__________________________________________________________________________________________________
conv2d_9 (Conv2D)               (None, 1, 1, 18)     1296        reshape[0][0]                    
__________________________________________________________________________________________________
conv2d_10 (Conv2D)              (None, 1, 1, 72)     1296        conv2d_9[0][0]                   
__________________________________________________________________________________________________
activation (Activation)         (None, 1, 1, 72)     0           conv2d_10[0][0]                  
__________________________________________________________________________________________________
multiply (Multiply)             (None, 28, 28, 72)   0           batch_normalization_18[0][0]     
                                                                 activation[0][0]                 
__________________________________________________________________________________________________
conv2d_11 (Conv2D)              (None, 28, 28, 20)   1440        multiply[0][0]                   
__________________________________________________________________________________________________
batch_normalization_19 (BatchNo (None, 28, 28, 20)   80          conv2d_11[0][0]                  
__________________________________________________________________________________________________
depthwise_conv2d_11 (DepthwiseC (None, 28, 28, 24)   600         add_2[0][0]                      
__________________________________________________________________________________________________
depthwise_conv2d_10 (DepthwiseC (None, 28, 28, 20)   180         batch_normalization_19[0][0]     
__________________________________________________________________________________________________
batch_normalization_21 (BatchNo (None, 28, 28, 24)   96          depthwise_conv2d_11[0][0]        
__________________________________________________________________________________________________
batch_normalization_20 (BatchNo (None, 28, 28, 20)   80          depthwise_conv2d_10[0][0]        
__________________________________________________________________________________________________
conv2d_12 (Conv2D)              (None, 28, 28, 40)   960         batch_normalization_21[0][0]     
__________________________________________________________________________________________________
concatenate_7 (Concatenate)     (None, 28, 28, 40)   0           batch_normalization_19[0][0]     
                                                                 batch_normalization_20[0][0]     
__________________________________________________________________________________________________
batch_normalization_22 (BatchNo (None, 28, 28, 40)   160         conv2d_12[0][0]                  
__________________________________________________________________________________________________
add_3 (Add)                     (None, 28, 28, 40)   0           concatenate_7[0][0]              
                                                                 batch_normalization_22[0][0]     
__________________________________________________________________________________________________
conv2d_13 (Conv2D)              (None, 28, 28, 60)   2400        add_3[0][0]                      
__________________________________________________________________________________________________
batch_normalization_23 (BatchNo (None, 28, 28, 60)   240         conv2d_13[0][0]                  
__________________________________________________________________________________________________
re_lu_9 (ReLU)                  (None, 28, 28, 60)   0           batch_normalization_23[0][0]     
__________________________________________________________________________________________________
depthwise_conv2d_12 (DepthwiseC (None, 28, 28, 60)   540         re_lu_9[0][0]                    
__________________________________________________________________________________________________
batch_normalization_24 (BatchNo (None, 28, 28, 60)   240         depthwise_conv2d_12[0][0]        
__________________________________________________________________________________________________
re_lu_10 (ReLU)                 (None, 28, 28, 60)   0           batch_normalization_24[0][0]     
__________________________________________________________________________________________________
concatenate_8 (Concatenate)     (None, 28, 28, 120)  0           re_lu_9[0][0]                    
                                                                 re_lu_10[0][0]                   
__________________________________________________________________________________________________
global_average_pooling2d_1 (Glo (None, 120)          0           concatenate_8[0][0]              
__________________________________________________________________________________________________
reshape_1 (Reshape)             (None, 1, 1, 120)    0           global_average_pooling2d_1[0][0] 
__________________________________________________________________________________________________
conv2d_14 (Conv2D)              (None, 1, 1, 30)     3600        reshape_1[0][0]                  
__________________________________________________________________________________________________
conv2d_15 (Conv2D)              (None, 1, 1, 120)    3600        conv2d_14[0][0]                  
__________________________________________________________________________________________________
activation_1 (Activation)       (None, 1, 1, 120)    0           conv2d_15[0][0]                  
__________________________________________________________________________________________________
multiply_1 (Multiply)           (None, 28, 28, 120)  0           concatenate_8[0][0]              
                                                                 activation_1[0][0]               
__________________________________________________________________________________________________
conv2d_16 (Conv2D)              (None, 28, 28, 20)   2400        multiply_1[0][0]                 
__________________________________________________________________________________________________
batch_normalization_25 (BatchNo (None, 28, 28, 20)   80          conv2d_16[0][0]                  
__________________________________________________________________________________________________
depthwise_conv2d_13 (DepthwiseC (None, 28, 28, 20)   180         batch_normalization_25[0][0]     
__________________________________________________________________________________________________
batch_normalization_26 (BatchNo (None, 28, 28, 20)   80          depthwise_conv2d_13[0][0]        
__________________________________________________________________________________________________
concatenate_9 (Concatenate)     (None, 28, 28, 40)   0           batch_normalization_25[0][0]     
                                                                 batch_normalization_26[0][0]     
__________________________________________________________________________________________________
add_4 (Add)                     (None, 28, 28, 40)   0           concatenate_9[0][0]              
                                                                 add_3[0][0]                      
__________________________________________________________________________________________________
conv2d_17 (Conv2D)              (None, 28, 28, 120)  4800        add_4[0][0]                      
__________________________________________________________________________________________________
batch_normalization_27 (BatchNo (None, 28, 28, 120)  480         conv2d_17[0][0]                  
__________________________________________________________________________________________________
re_lu_11 (ReLU)                 (None, 28, 28, 120)  0           batch_normalization_27[0][0]     
__________________________________________________________________________________________________
depthwise_conv2d_14 (DepthwiseC (None, 28, 28, 120)  1080        re_lu_11[0][0]                   
__________________________________________________________________________________________________
batch_normalization_28 (BatchNo (None, 28, 28, 120)  480         depthwise_conv2d_14[0][0]        
__________________________________________________________________________________________________
re_lu_12 (ReLU)                 (None, 28, 28, 120)  0           batch_normalization_28[0][0]     
__________________________________________________________________________________________________
concatenate_10 (Concatenate)    (None, 28, 28, 240)  0           re_lu_11[0][0]                   
                                                                 re_lu_12[0][0]                   
__________________________________________________________________________________________________
depthwise_conv2d_15 (DepthwiseC (None, 14, 14, 240)  2160        concatenate_10[0][0]             
__________________________________________________________________________________________________
batch_normalization_29 (BatchNo (None, 14, 14, 240)  960         depthwise_conv2d_15[0][0]        
__________________________________________________________________________________________________
conv2d_18 (Conv2D)              (None, 14, 14, 40)   9600        batch_normalization_29[0][0]     
__________________________________________________________________________________________________
batch_normalization_30 (BatchNo (None, 14, 14, 40)   160         conv2d_18[0][0]                  
__________________________________________________________________________________________________
depthwise_conv2d_17 (DepthwiseC (None, 14, 14, 40)   360         add_4[0][0]                      
__________________________________________________________________________________________________
depthwise_conv2d_16 (DepthwiseC (None, 14, 14, 40)   360         batch_normalization_30[0][0]     
__________________________________________________________________________________________________
batch_normalization_32 (BatchNo (None, 14, 14, 40)   160         depthwise_conv2d_17[0][0]        
__________________________________________________________________________________________________
batch_normalization_31 (BatchNo (None, 14, 14, 40)   160         depthwise_conv2d_16[0][0]        
__________________________________________________________________________________________________
conv2d_19 (Conv2D)              (None, 14, 14, 80)   3200        batch_normalization_32[0][0]     
__________________________________________________________________________________________________
concatenate_11 (Concatenate)    (None, 14, 14, 80)   0           batch_normalization_30[0][0]     
                                                                 batch_normalization_31[0][0]     
__________________________________________________________________________________________________
batch_normalization_33 (BatchNo (None, 14, 14, 80)   320         conv2d_19[0][0]                  
__________________________________________________________________________________________________
add_5 (Add)                     (None, 14, 14, 80)   0           concatenate_11[0][0]             
                                                                 batch_normalization_33[0][0]     
__________________________________________________________________________________________________
conv2d_20 (Conv2D)              (None, 14, 14, 100)  8000        add_5[0][0]                      
__________________________________________________________________________________________________
batch_normalization_34 (BatchNo (None, 14, 14, 100)  400         conv2d_20[0][0]                  
__________________________________________________________________________________________________
re_lu_13 (ReLU)                 (None, 14, 14, 100)  0           batch_normalization_34[0][0]     
__________________________________________________________________________________________________
depthwise_conv2d_18 (DepthwiseC (None, 14, 14, 100)  900         re_lu_13[0][0]                   
__________________________________________________________________________________________________
batch_normalization_35 (BatchNo (None, 14, 14, 100)  400         depthwise_conv2d_18[0][0]        
__________________________________________________________________________________________________
re_lu_14 (ReLU)                 (None, 14, 14, 100)  0           batch_normalization_35[0][0]     
__________________________________________________________________________________________________
concatenate_12 (Concatenate)    (None, 14, 14, 200)  0           re_lu_13[0][0]                   
                                                                 re_lu_14[0][0]                   
__________________________________________________________________________________________________
conv2d_21 (Conv2D)              (None, 14, 14, 40)   8000        concatenate_12[0][0]             
__________________________________________________________________________________________________
batch_normalization_36 (BatchNo (None, 14, 14, 40)   160         conv2d_21[0][0]                  
__________________________________________________________________________________________________
depthwise_conv2d_19 (DepthwiseC (None, 14, 14, 40)   360         batch_normalization_36[0][0]     
__________________________________________________________________________________________________
batch_normalization_37 (BatchNo (None, 14, 14, 40)   160         depthwise_conv2d_19[0][0]        
__________________________________________________________________________________________________
concatenate_13 (Concatenate)    (None, 14, 14, 80)   0           batch_normalization_36[0][0]     
                                                                 batch_normalization_37[0][0]     
__________________________________________________________________________________________________
add_6 (Add)                     (None, 14, 14, 80)   0           concatenate_13[0][0]             
                                                                 add_5[0][0]                      
__________________________________________________________________________________________________
conv2d_22 (Conv2D)              (None, 14, 14, 92)   7360        add_6[0][0]                      
__________________________________________________________________________________________________
batch_normalization_38 (BatchNo (None, 14, 14, 92)   368         conv2d_22[0][0]                  
__________________________________________________________________________________________________
re_lu_15 (ReLU)                 (None, 14, 14, 92)   0           batch_normalization_38[0][0]     
__________________________________________________________________________________________________
depthwise_conv2d_20 (DepthwiseC (None, 14, 14, 92)   828         re_lu_15[0][0]                   
__________________________________________________________________________________________________
batch_normalization_39 (BatchNo (None, 14, 14, 92)   368         depthwise_conv2d_20[0][0]        
__________________________________________________________________________________________________
re_lu_16 (ReLU)                 (None, 14, 14, 92)   0           batch_normalization_39[0][0]     
__________________________________________________________________________________________________
concatenate_14 (Concatenate)    (None, 14, 14, 184)  0           re_lu_15[0][0]                   
                                                                 re_lu_16[0][0]                   
__________________________________________________________________________________________________
conv2d_23 (Conv2D)              (None, 14, 14, 40)   7360        concatenate_14[0][0]             
__________________________________________________________________________________________________
batch_normalization_40 (BatchNo (None, 14, 14, 40)   160         conv2d_23[0][0]                  
__________________________________________________________________________________________________
depthwise_conv2d_21 (DepthwiseC (None, 14, 14, 40)   360         batch_normalization_40[0][0]     
__________________________________________________________________________________________________
batch_normalization_41 (BatchNo (None, 14, 14, 40)   160         depthwise_conv2d_21[0][0]        
__________________________________________________________________________________________________
concatenate_15 (Concatenate)    (None, 14, 14, 80)   0           batch_normalization_40[0][0]     
                                                                 batch_normalization_41[0][0]     
__________________________________________________________________________________________________
add_7 (Add)                     (None, 14, 14, 80)   0           concatenate_15[0][0]             
                                                                 add_6[0][0]                      
__________________________________________________________________________________________________
conv2d_24 (Conv2D)              (None, 14, 14, 92)   7360        add_7[0][0]                      
__________________________________________________________________________________________________
batch_normalization_42 (BatchNo (None, 14, 14, 92)   368         conv2d_24[0][0]                  
__________________________________________________________________________________________________
re_lu_17 (ReLU)                 (None, 14, 14, 92)   0           batch_normalization_42[0][0]     
__________________________________________________________________________________________________
depthwise_conv2d_22 (DepthwiseC (None, 14, 14, 92)   828         re_lu_17[0][0]                   
__________________________________________________________________________________________________
batch_normalization_43 (BatchNo (None, 14, 14, 92)   368         depthwise_conv2d_22[0][0]        
__________________________________________________________________________________________________
re_lu_18 (ReLU)                 (None, 14, 14, 92)   0           batch_normalization_43[0][0]     
__________________________________________________________________________________________________
concatenate_16 (Concatenate)    (None, 14, 14, 184)  0           re_lu_17[0][0]                   
                                                                 re_lu_18[0][0]                   
__________________________________________________________________________________________________
conv2d_25 (Conv2D)              (None, 14, 14, 40)   7360        concatenate_16[0][0]             
__________________________________________________________________________________________________
batch_normalization_44 (BatchNo (None, 14, 14, 40)   160         conv2d_25[0][0]                  
__________________________________________________________________________________________________
depthwise_conv2d_23 (DepthwiseC (None, 14, 14, 40)   360         batch_normalization_44[0][0]     
__________________________________________________________________________________________________
batch_normalization_45 (BatchNo (None, 14, 14, 40)   160         depthwise_conv2d_23[0][0]        
__________________________________________________________________________________________________
concatenate_17 (Concatenate)    (None, 14, 14, 80)   0           batch_normalization_44[0][0]     
                                                                 batch_normalization_45[0][0]     
__________________________________________________________________________________________________
add_8 (Add)                     (None, 14, 14, 80)   0           concatenate_17[0][0]             
                                                                 add_7[0][0]                      
__________________________________________________________________________________________________
conv2d_26 (Conv2D)              (None, 14, 14, 240)  19200       add_8[0][0]                      
__________________________________________________________________________________________________
batch_normalization_46 (BatchNo (None, 14, 14, 240)  960         conv2d_26[0][0]                  
__________________________________________________________________________________________________
re_lu_19 (ReLU)                 (None, 14, 14, 240)  0           batch_normalization_46[0][0]     
__________________________________________________________________________________________________
depthwise_conv2d_24 (DepthwiseC (None, 14, 14, 240)  2160        re_lu_19[0][0]                   
__________________________________________________________________________________________________
batch_normalization_47 (BatchNo (None, 14, 14, 240)  960         depthwise_conv2d_24[0][0]        
__________________________________________________________________________________________________
re_lu_20 (ReLU)                 (None, 14, 14, 240)  0           batch_normalization_47[0][0]     
__________________________________________________________________________________________________
concatenate_18 (Concatenate)    (None, 14, 14, 480)  0           re_lu_19[0][0]                   
                                                                 re_lu_20[0][0]                   
__________________________________________________________________________________________________
global_average_pooling2d_2 (Glo (None, 480)          0           concatenate_18[0][0]             
__________________________________________________________________________________________________
reshape_2 (Reshape)             (None, 1, 1, 480)    0           global_average_pooling2d_2[0][0] 
__________________________________________________________________________________________________
conv2d_27 (Conv2D)              (None, 1, 1, 120)    57600       reshape_2[0][0]                  
__________________________________________________________________________________________________
conv2d_28 (Conv2D)              (None, 1, 1, 480)    57600       conv2d_27[0][0]                  
__________________________________________________________________________________________________
activation_2 (Activation)       (None, 1, 1, 480)    0           conv2d_28[0][0]                  
__________________________________________________________________________________________________
multiply_2 (Multiply)           (None, 14, 14, 480)  0           concatenate_18[0][0]             
                                                                 activation_2[0][0]               
__________________________________________________________________________________________________
conv2d_29 (Conv2D)              (None, 14, 14, 56)   26880       multiply_2[0][0]                 
__________________________________________________________________________________________________
batch_normalization_48 (BatchNo (None, 14, 14, 56)   224         conv2d_29[0][0]                  
__________________________________________________________________________________________________
depthwise_conv2d_26 (DepthwiseC (None, 14, 14, 80)   720         add_8[0][0]                      
__________________________________________________________________________________________________
depthwise_conv2d_25 (DepthwiseC (None, 14, 14, 56)   504         batch_normalization_48[0][0]     
__________________________________________________________________________________________________
batch_normalization_50 (BatchNo (None, 14, 14, 80)   320         depthwise_conv2d_26[0][0]        
__________________________________________________________________________________________________
batch_normalization_49 (BatchNo (None, 14, 14, 56)   224         depthwise_conv2d_25[0][0]        
__________________________________________________________________________________________________
conv2d_30 (Conv2D)              (None, 14, 14, 112)  8960        batch_normalization_50[0][0]     
__________________________________________________________________________________________________
concatenate_19 (Concatenate)    (None, 14, 14, 112)  0           batch_normalization_48[0][0]     
                                                                 batch_normalization_49[0][0]     
__________________________________________________________________________________________________
batch_normalization_51 (BatchNo (None, 14, 14, 112)  448         conv2d_30[0][0]                  
__________________________________________________________________________________________________
add_9 (Add)                     (None, 14, 14, 112)  0           concatenate_19[0][0]             
                                                                 batch_normalization_51[0][0]     
__________________________________________________________________________________________________
conv2d_31 (Conv2D)              (None, 14, 14, 336)  37632       add_9[0][0]                      
__________________________________________________________________________________________________
batch_normalization_52 (BatchNo (None, 14, 14, 336)  1344        conv2d_31[0][0]                  
__________________________________________________________________________________________________
re_lu_21 (ReLU)                 (None, 14, 14, 336)  0           batch_normalization_52[0][0]     
__________________________________________________________________________________________________
depthwise_conv2d_27 (DepthwiseC (None, 14, 14, 336)  3024        re_lu_21[0][0]                   
__________________________________________________________________________________________________
batch_normalization_53 (BatchNo (None, 14, 14, 336)  1344        depthwise_conv2d_27[0][0]        
__________________________________________________________________________________________________
re_lu_22 (ReLU)                 (None, 14, 14, 336)  0           batch_normalization_53[0][0]     
__________________________________________________________________________________________________
concatenate_20 (Concatenate)    (None, 14, 14, 672)  0           re_lu_21[0][0]                   
                                                                 re_lu_22[0][0]                   
__________________________________________________________________________________________________
global_average_pooling2d_3 (Glo (None, 672)          0           concatenate_20[0][0]             
__________________________________________________________________________________________________
reshape_3 (Reshape)             (None, 1, 1, 672)    0           global_average_pooling2d_3[0][0] 
__________________________________________________________________________________________________
conv2d_32 (Conv2D)              (None, 1, 1, 168)    112896      reshape_3[0][0]                  
__________________________________________________________________________________________________
conv2d_33 (Conv2D)              (None, 1, 1, 672)    112896      conv2d_32[0][0]                  
__________________________________________________________________________________________________
activation_3 (Activation)       (None, 1, 1, 672)    0           conv2d_33[0][0]                  
__________________________________________________________________________________________________
multiply_3 (Multiply)           (None, 14, 14, 672)  0           concatenate_20[0][0]             
                                                                 activation_3[0][0]               
__________________________________________________________________________________________________
conv2d_34 (Conv2D)              (None, 14, 14, 56)   37632       multiply_3[0][0]                 
__________________________________________________________________________________________________
batch_normalization_54 (BatchNo (None, 14, 14, 56)   224         conv2d_34[0][0]                  
__________________________________________________________________________________________________
depthwise_conv2d_28 (DepthwiseC (None, 14, 14, 56)   504         batch_normalization_54[0][0]     
__________________________________________________________________________________________________
batch_normalization_55 (BatchNo (None, 14, 14, 56)   224         depthwise_conv2d_28[0][0]        
__________________________________________________________________________________________________
concatenate_21 (Concatenate)    (None, 14, 14, 112)  0           batch_normalization_54[0][0]     
                                                                 batch_normalization_55[0][0]     
__________________________________________________________________________________________________
add_10 (Add)                    (None, 14, 14, 112)  0           concatenate_21[0][0]             
                                                                 add_9[0][0]                      
__________________________________________________________________________________________________
conv2d_35 (Conv2D)              (None, 14, 14, 336)  37632       add_10[0][0]                     
__________________________________________________________________________________________________
batch_normalization_56 (BatchNo (None, 14, 14, 336)  1344        conv2d_35[0][0]                  
__________________________________________________________________________________________________
re_lu_23 (ReLU)                 (None, 14, 14, 336)  0           batch_normalization_56[0][0]     
__________________________________________________________________________________________________
depthwise_conv2d_29 (DepthwiseC (None, 14, 14, 336)  3024        re_lu_23[0][0]                   
__________________________________________________________________________________________________
batch_normalization_57 (BatchNo (None, 14, 14, 336)  1344        depthwise_conv2d_29[0][0]        
__________________________________________________________________________________________________
re_lu_24 (ReLU)                 (None, 14, 14, 336)  0           batch_normalization_57[0][0]     
__________________________________________________________________________________________________
concatenate_22 (Concatenate)    (None, 14, 14, 672)  0           re_lu_23[0][0]                   
                                                                 re_lu_24[0][0]                   
__________________________________________________________________________________________________
depthwise_conv2d_30 (DepthwiseC (None, 7, 7, 672)    16800       concatenate_22[0][0]             
__________________________________________________________________________________________________
batch_normalization_58 (BatchNo (None, 7, 7, 672)    2688        depthwise_conv2d_30[0][0]        
__________________________________________________________________________________________________
global_average_pooling2d_4 (Glo (None, 672)          0           batch_normalization_58[0][0]     
__________________________________________________________________________________________________
reshape_4 (Reshape)             (None, 1, 1, 672)    0           global_average_pooling2d_4[0][0] 
__________________________________________________________________________________________________
conv2d_36 (Conv2D)              (None, 1, 1, 168)    112896      reshape_4[0][0]                  
__________________________________________________________________________________________________
conv2d_37 (Conv2D)              (None, 1, 1, 672)    112896      conv2d_36[0][0]                  
__________________________________________________________________________________________________
activation_4 (Activation)       (None, 1, 1, 672)    0           conv2d_37[0][0]                  
__________________________________________________________________________________________________
multiply_4 (Multiply)           (None, 7, 7, 672)    0           batch_normalization_58[0][0]     
                                                                 activation_4[0][0]               
__________________________________________________________________________________________________
conv2d_38 (Conv2D)              (None, 7, 7, 80)     53760       multiply_4[0][0]                 
__________________________________________________________________________________________________
batch_normalization_59 (BatchNo (None, 7, 7, 80)     320         conv2d_38[0][0]                  
__________________________________________________________________________________________________
depthwise_conv2d_32 (DepthwiseC (None, 7, 7, 112)    2800        add_10[0][0]                     
__________________________________________________________________________________________________
depthwise_conv2d_31 (DepthwiseC (None, 7, 7, 80)     720         batch_normalization_59[0][0]     
__________________________________________________________________________________________________
batch_normalization_61 (BatchNo (None, 7, 7, 112)    448         depthwise_conv2d_32[0][0]        
__________________________________________________________________________________________________
batch_normalization_60 (BatchNo (None, 7, 7, 80)     320         depthwise_conv2d_31[0][0]        
__________________________________________________________________________________________________
conv2d_39 (Conv2D)              (None, 7, 7, 160)    17920       batch_normalization_61[0][0]     
__________________________________________________________________________________________________
concatenate_23 (Concatenate)    (None, 7, 7, 160)    0           batch_normalization_59[0][0]     
                                                                 batch_normalization_60[0][0]     
__________________________________________________________________________________________________
batch_normalization_62 (BatchNo (None, 7, 7, 160)    640         conv2d_39[0][0]                  
__________________________________________________________________________________________________
add_11 (Add)                    (None, 7, 7, 160)    0           concatenate_23[0][0]             
                                                                 batch_normalization_62[0][0]     
__________________________________________________________________________________________________
conv2d_40 (Conv2D)              (None, 7, 7, 480)    76800       add_11[0][0]                     
__________________________________________________________________________________________________
batch_normalization_63 (BatchNo (None, 7, 7, 480)    1920        conv2d_40[0][0]                  
__________________________________________________________________________________________________
re_lu_25 (ReLU)                 (None, 7, 7, 480)    0           batch_normalization_63[0][0]     
__________________________________________________________________________________________________
depthwise_conv2d_33 (DepthwiseC (None, 7, 7, 480)    4320        re_lu_25[0][0]                   
__________________________________________________________________________________________________
batch_normalization_64 (BatchNo (None, 7, 7, 480)    1920        depthwise_conv2d_33[0][0]        
__________________________________________________________________________________________________
re_lu_26 (ReLU)                 (None, 7, 7, 480)    0           batch_normalization_64[0][0]     
__________________________________________________________________________________________________
concatenate_24 (Concatenate)    (None, 7, 7, 960)    0           re_lu_25[0][0]                   
                                                                 re_lu_26[0][0]                   
__________________________________________________________________________________________________
conv2d_41 (Conv2D)              (None, 7, 7, 80)     76800       concatenate_24[0][0]             
__________________________________________________________________________________________________
batch_normalization_65 (BatchNo (None, 7, 7, 80)     320         conv2d_41[0][0]                  
__________________________________________________________________________________________________
depthwise_conv2d_34 (DepthwiseC (None, 7, 7, 80)     720         batch_normalization_65[0][0]     
__________________________________________________________________________________________________
batch_normalization_66 (BatchNo (None, 7, 7, 80)     320         depthwise_conv2d_34[0][0]        
__________________________________________________________________________________________________
concatenate_25 (Concatenate)    (None, 7, 7, 160)    0           batch_normalization_65[0][0]     
                                                                 batch_normalization_66[0][0]     
__________________________________________________________________________________________________
add_12 (Add)                    (None, 7, 7, 160)    0           concatenate_25[0][0]             
                                                                 add_11[0][0]                     
__________________________________________________________________________________________________
conv2d_42 (Conv2D)              (None, 7, 7, 480)    76800       add_12[0][0]                     
__________________________________________________________________________________________________
batch_normalization_67 (BatchNo (None, 7, 7, 480)    1920        conv2d_42[0][0]                  
__________________________________________________________________________________________________
re_lu_27 (ReLU)                 (None, 7, 7, 480)    0           batch_normalization_67[0][0]     
__________________________________________________________________________________________________
depthwise_conv2d_35 (DepthwiseC (None, 7, 7, 480)    4320        re_lu_27[0][0]                   
__________________________________________________________________________________________________
batch_normalization_68 (BatchNo (None, 7, 7, 480)    1920        depthwise_conv2d_35[0][0]        
__________________________________________________________________________________________________
re_lu_28 (ReLU)                 (None, 7, 7, 480)    0           batch_normalization_68[0][0]     
__________________________________________________________________________________________________
concatenate_26 (Concatenate)    (None, 7, 7, 960)    0           re_lu_27[0][0]                   
                                                                 re_lu_28[0][0]                   
__________________________________________________________________________________________________
global_average_pooling2d_5 (Glo (None, 960)          0           concatenate_26[0][0]             
__________________________________________________________________________________________________
reshape_5 (Reshape)             (None, 1, 1, 960)    0           global_average_pooling2d_5[0][0] 
__________________________________________________________________________________________________
conv2d_43 (Conv2D)              (None, 1, 1, 240)    230400      reshape_5[0][0]                  
__________________________________________________________________________________________________
conv2d_44 (Conv2D)              (None, 1, 1, 960)    230400      conv2d_43[0][0]                  
__________________________________________________________________________________________________
activation_5 (Activation)       (None, 1, 1, 960)    0           conv2d_44[0][0]                  
__________________________________________________________________________________________________
multiply_5 (Multiply)           (None, 7, 7, 960)    0           concatenate_26[0][0]             
                                                                 activation_5[0][0]               
__________________________________________________________________________________________________
conv2d_45 (Conv2D)              (None, 7, 7, 80)     76800       multiply_5[0][0]                 
__________________________________________________________________________________________________
batch_normalization_69 (BatchNo (None, 7, 7, 80)     320         conv2d_45[0][0]                  
__________________________________________________________________________________________________
depthwise_conv2d_36 (DepthwiseC (None, 7, 7, 80)     720         batch_normalization_69[0][0]     
__________________________________________________________________________________________________
batch_normalization_70 (BatchNo (None, 7, 7, 80)     320         depthwise_conv2d_36[0][0]        
__________________________________________________________________________________________________
concatenate_27 (Concatenate)    (None, 7, 7, 160)    0           batch_normalization_69[0][0]     
                                                                 batch_normalization_70[0][0]     
__________________________________________________________________________________________________
add_13 (Add)                    (None, 7, 7, 160)    0           concatenate_27[0][0]             
                                                                 add_12[0][0]                     
__________________________________________________________________________________________________
conv2d_46 (Conv2D)              (None, 7, 7, 480)    76800       add_13[0][0]                     
__________________________________________________________________________________________________
batch_normalization_71 (BatchNo (None, 7, 7, 480)    1920        conv2d_46[0][0]                  
__________________________________________________________________________________________________
re_lu_29 (ReLU)                 (None, 7, 7, 480)    0           batch_normalization_71[0][0]     
__________________________________________________________________________________________________
depthwise_conv2d_37 (DepthwiseC (None, 7, 7, 480)    4320        re_lu_29[0][0]                   
__________________________________________________________________________________________________
batch_normalization_72 (BatchNo (None, 7, 7, 480)    1920        depthwise_conv2d_37[0][0]        
__________________________________________________________________________________________________
re_lu_30 (ReLU)                 (None, 7, 7, 480)    0           batch_normalization_72[0][0]     
__________________________________________________________________________________________________
concatenate_28 (Concatenate)    (None, 7, 7, 960)    0           re_lu_29[0][0]                   
                                                                 re_lu_30[0][0]                   
__________________________________________________________________________________________________
conv2d_47 (Conv2D)              (None, 7, 7, 80)     76800       concatenate_28[0][0]             
__________________________________________________________________________________________________
batch_normalization_73 (BatchNo (None, 7, 7, 80)     320         conv2d_47[0][0]                  
__________________________________________________________________________________________________
depthwise_conv2d_38 (DepthwiseC (None, 7, 7, 80)     720         batch_normalization_73[0][0]     
__________________________________________________________________________________________________
batch_normalization_74 (BatchNo (None, 7, 7, 80)     320         depthwise_conv2d_38[0][0]        
__________________________________________________________________________________________________
concatenate_29 (Concatenate)    (None, 7, 7, 160)    0           batch_normalization_73[0][0]     
                                                                 batch_normalization_74[0][0]     
__________________________________________________________________________________________________
add_14 (Add)                    (None, 7, 7, 160)    0           concatenate_29[0][0]             
                                                                 add_13[0][0]                     
__________________________________________________________________________________________________
conv2d_48 (Conv2D)              (None, 7, 7, 480)    76800       add_14[0][0]                     
__________________________________________________________________________________________________
batch_normalization_75 (BatchNo (None, 7, 7, 480)    1920        conv2d_48[0][0]                  
__________________________________________________________________________________________________
re_lu_31 (ReLU)                 (None, 7, 7, 480)    0           batch_normalization_75[0][0]     
__________________________________________________________________________________________________
depthwise_conv2d_39 (DepthwiseC (None, 7, 7, 480)    4320        re_lu_31[0][0]                   
__________________________________________________________________________________________________
batch_normalization_76 (BatchNo (None, 7, 7, 480)    1920        depthwise_conv2d_39[0][0]        
__________________________________________________________________________________________________
re_lu_32 (ReLU)                 (None, 7, 7, 480)    0           batch_normalization_76[0][0]     
__________________________________________________________________________________________________
concatenate_30 (Concatenate)    (None, 7, 7, 960)    0           re_lu_31[0][0]                   
                                                                 re_lu_32[0][0]                   
__________________________________________________________________________________________________
global_average_pooling2d_6 (Glo (None, 960)          0           concatenate_30[0][0]             
__________________________________________________________________________________________________
reshape_6 (Reshape)             (None, 1, 1, 960)    0           global_average_pooling2d_6[0][0] 
__________________________________________________________________________________________________
conv2d_49 (Conv2D)              (None, 1, 1, 240)    230400      reshape_6[0][0]                  
__________________________________________________________________________________________________
conv2d_50 (Conv2D)              (None, 1, 1, 960)    230400      conv2d_49[0][0]                  
__________________________________________________________________________________________________
activation_6 (Activation)       (None, 1, 1, 960)    0           conv2d_50[0][0]                  
__________________________________________________________________________________________________
multiply_6 (Multiply)           (None, 7, 7, 960)    0           concatenate_30[0][0]             
                                                                 activation_6[0][0]               
__________________________________________________________________________________________________
conv2d_51 (Conv2D)              (None, 7, 7, 80)     76800       multiply_6[0][0]                 
__________________________________________________________________________________________________
batch_normalization_77 (BatchNo (None, 7, 7, 80)     320         conv2d_51[0][0]                  
__________________________________________________________________________________________________
depthwise_conv2d_40 (DepthwiseC (None, 7, 7, 80)     720         batch_normalization_77[0][0]     
__________________________________________________________________________________________________
batch_normalization_78 (BatchNo (None, 7, 7, 80)     320         depthwise_conv2d_40[0][0]        
__________________________________________________________________________________________________
concatenate_31 (Concatenate)    (None, 7, 7, 160)    0           batch_normalization_77[0][0]     
                                                                 batch_normalization_78[0][0]     
__________________________________________________________________________________________________
add_15 (Add)                    (None, 7, 7, 160)    0           concatenate_31[0][0]             
                                                                 add_14[0][0]                     
__________________________________________________________________________________________________
conv2d_52 (Conv2D)              (None, 7, 7, 960)    153600      add_15[0][0]                     
__________________________________________________________________________________________________
batch_normalization_79 (BatchNo (None, 7, 7, 960)    3840        conv2d_52[0][0]                  
__________________________________________________________________________________________________
re_lu_33 (ReLU)                 (None, 7, 7, 960)    0           batch_normalization_79[0][0]     
__________________________________________________________________________________________________
global_average_pooling2d_7 (Glo (None, 960)          0           re_lu_33[0][0]                   
__________________________________________________________________________________________________
reshape_7 (Reshape)             (None, 1, 1, 960)    0           global_average_pooling2d_7[0][0] 
__________________________________________________________________________________________________
conv2d_53 (Conv2D)              (None, 1, 1, 1280)   1228800     reshape_7[0][0]                  
__________________________________________________________________________________________________
batch_normalization_80 (BatchNo (None, 1, 1, 1280)   5120        conv2d_53[0][0]                  
__________________________________________________________________________________________________
re_lu_34 (ReLU)                 (None, 1, 1, 1280)   0           batch_normalization_80[0][0]     
__________________________________________________________________________________________________
flatten (Flatten)               (None, 1280)         0           re_lu_34[0][0]                   
__________________________________________________________________________________________________
dense (Dense)                   (None, 1000)         1281000     flatten[0][0]                    
==================================================================================================
Total params: 5,202,624
Trainable params: 5,178,096
Non-trainable params: 24,528
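
The non-trainable parameters (24,528) are the moving mean/variance statistics of the BatchNormalization layers; every Conv2D, DepthwiseConv2D and Dense weight in this network is trainable. A minimal sanity-check sketch, assuming `model` holds the GhostNet instance built above (the variable name is an assumption):

import numpy as np

# Sketch only: `model` is assumed to be the GhostNet Keras model built in the previous sections.
# In this network the only non-trainable weights are the BN moving_mean / moving_variance tensors.
non_trainable = sum(int(np.prod(w.shape)) for w in model.non_trainable_weights)
print(non_trainable)  # should print 24528, matching the summary above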

5.6 Model Structure Diagram

image-20220905222617954
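
For reference, a minimal sketch of how the summary table and the structure diagram above might be generated, assuming `GhostNet(...)` is the builder function assembled in the earlier sections (its name and arguments are assumptions) and using the third-party plot_model utility imported at the top of Section 5 with Keras-style arguments:

from plot_model import plot_model

# Sketch only: build the model, print the layer table and export the architecture diagram.
# `GhostNet(...)` is an assumed builder name/signature based on the preceding code.
model = GhostNet(input_shape=(224, 224, 3), classes=1000)
model.summary()                                              # prints the layer-by-layer table shown above
plot_model(model, to_file='ghostnet.png', show_shapes=True)  # writes the diagram image shown above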

References

https://github.com/huawei-noah/Efficient-AI-Backbones/blob/master/ghostnet_tensorflow/ghostnet.py

GhostNet: More Features from Cheap Operations

神经网络学习小记录58——Keras GhostNet模型的复现详解

GhostNet 代码复现,网络解析,附TensorFlow完整代码 (note: a few parts of that author's code are incorrect, presumably oversights)
