Classic CNN Architectures in Practice, Week J6: ResNeXt-50 Explained

  • This post is a study-log entry from the 365-day deep learning training camp
  • Original author: K同学啊

Contents

  • I. Background and Development Environment
  • II. Model Architecture
  • III. Grouped Convolution
  • IV. Reproducing ResNeXt-50 in PyTorch
    • 1. Grouped convolution module
    • 2. Defining the residual block
    • 3. Stacking residual blocks
    • 4. Building the ResNeXt-50 network
    • 5. Model summary

I. Background and Development Environment

Week J6: ResNeXt-50 in Practice

  • Language: Python 3, PyTorch
  • This week's tasks:
    – 1. Read the ResNeXt paper and understand the authors' design rationale
    – 2. Compare it with the ResNet50V2 and DenseNet architectures covered earlier
    – 3. Use ResNeXt-50 to complete the monkeypox image classification task

II. Model Architecture

ResNeXt is an image classification network introduced by Kaiming He's team (Xie et al.) at CVPR 2017. It is an upgrade of ResNet: building on ResNet, it introduces the concept of cardinality. Like ResNet, ResNeXt comes in ResNeXt-50 and ResNeXt-101 variants. The original paper:
Aggregated Residual Transformations for Deep Neural Networks.pdf

In the ResNeXt paper, the authors point out a problem that was widespread at the time: to raise a model's accuracy, one usually made the network deeper or wider. Although effective, this approach increases both the difficulty of network design and the computational cost, so an ever larger price is paid for ever smaller accuracy gains. A better strategy was needed, one that improves accuracy without adding computational cost. This is what led the authors to propose the concept of cardinality.

The figure below shows the difference between a ResNet block (left) and a ResNeXt block (right). In ResNet, the 256-channel input is compressed 4× to 64 channels by a 1×1 convolution, processed by a 3×3 convolution, then expanded back by another 1×1 convolution and added to the input through the residual connection. ResNeXt follows the same overall recipe, but splits the 256-channel input into 32 branches, each compressed 64× down to 4 channels before processing. The 32 branches are summed and then added to the input through the residual connection. Here, cardinality refers to the number of identical parallel branches in a block.
[Figure 1: ResNet (left) vs. ResNeXt (right) building blocks]
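The claim that the two designs have similar complexity is easy to verify with back-of-the-envelope arithmetic (parameter counts for the convolutions only, ignoring biases and BatchNorm):

```python
# ResNet bottleneck: 1x1 reduce (256->64), 3x3 (64->64), 1x1 expand (64->256)
resnet_params = 256*64 + 64*64*3*3 + 64*256
# ResNeXt (32x4d) block: 1x1 (256->128), 32-group 3x3 (128->128), 1x1 (128->256)
resnext_params = 256*128 + (128*128*3*3) // 32 + 128*256
print(resnet_params, resnext_params)  # 69632 70144 -- both roughly 70k
```

Grouping divides the 3×3 convolution's parameter count by the number of groups, which is what lets ResNeXt double the bottleneck width to 128 channels at essentially no extra cost.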

III. Grouped Convolution

Grouped convolution, as used in ResNeXt, simply splits the feature maps into groups and convolves each group separately, which effectively reduces the amount of computation.

In a grouped convolution, each kernel processes only a subset of the channels. In the figure below, for example, the red kernel processes only the red channels, the green kernel only the green channels, and the yellow kernel only the yellow channels. Each kernel therefore has 2 channels and produces one feature map.
[Figures 2–3: grouped convolution, with each kernel operating on its own channel group]
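As a minimal sketch of the figure's setup (small, illustrative channel counts, not from the original post), note that PyTorch implements grouped convolution natively through the `groups` argument of `nn.Conv2d`:

```python
import torch
import torch.nn as nn

# 6 input channels split into 3 groups of 2; one kernel per group,
# so each kernel has 2 channels and produces one feature map.
conv = nn.Conv2d(in_channels=6, out_channels=3, kernel_size=3,
                 padding=1, groups=3, bias=False)
x = torch.randn(1, 6, 8, 8)
out = conv(x)
print(out.shape)  # torch.Size([1, 3, 8, 8])
# Each kernel sees only 6/3 = 2 channels: weight shape (3, 2, 3, 3) = 54 params
print(sum(p.numel() for p in conv.parameters()))  # 54
```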

IV. Reproducing ResNeXt-50 in PyTorch

1. Grouped convolution module


import torch
import torch.nn as nn

class GroupedConvBlock(nn.Module):
    def __init__(self, in_channel, kernel_size=3, stride=1, groups=32):
        super(GroupedConvBlock, self).__init__()
        self.g_channel = in_channel // groups
        self.groups = groups
        # One convolution per group, so every group has its own weights
        # (sharing a single conv across all groups would not be a true grouped convolution)
        self.convs = nn.ModuleList([
            nn.Conv2d(self.g_channel, self.g_channel, kernel_size=kernel_size,
                      stride=stride, padding=kernel_size//2, bias=False)
            for _ in range(groups)
        ])
        self.norm = nn.BatchNorm2d(in_channel)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        g_list = []
        # Convolve each channel group independently, then concatenate
        for c in range(self.groups):
            g = x[:, c*self.g_channel:(c+1)*self.g_channel, :, :]
            g_list.append(self.convs[c](g))
        x = torch.cat(g_list, dim=1)
        x = self.norm(x)
        x = self.relu(x)
        return x
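For reference, the explicit per-group loop above is equivalent in shape to a single native grouped convolution (`nn.Conv2d` with `groups=32`), which is usually much faster on GPU. A quick sanity check with this block's stage-2 dimensions:

```python
import torch
import torch.nn as nn

# One fused call instead of 32 separate convolutions
grouped = nn.Conv2d(128, 128, kernel_size=3, stride=1, padding=1,
                    groups=32, bias=False)
x = torch.randn(2, 128, 56, 56)
out = grouped(x)
print(out.shape)  # torch.Size([2, 128, 56, 56])
```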

2. Defining the residual block


''' Residual Block '''
class Block(nn.Module):
    def __init__(self, in_channel, filters, kernel_size=3, stride=1, groups=32, conv_shortcut=True):
        super(Block, self).__init__()
        self.shortcut = conv_shortcut
        if self.shortcut:
            self.short = nn.Conv2d(in_channel, 2*filters, kernel_size=1, stride=stride, padding=0, bias=False)
        elif stride>1:
            self.short = nn.MaxPool2d(kernel_size=1, stride=stride, padding=0)
        else:
            self.short = nn.Identity()
        
        self.conv1 = nn.Sequential(
            nn.Conv2d(in_channel, filters, kernel_size=1, stride=1, bias=False),
            nn.BatchNorm2d(filters),
            nn.ReLU(True)
        )
        self.conv2 = GroupedConvBlock(in_channel=filters, kernel_size=kernel_size, stride=stride, groups=groups)
        self.conv3 = nn.Sequential(
            nn.Conv2d(filters, 2*filters, kernel_size=1, stride=1, bias=False),
            nn.BatchNorm2d(2*filters)
        )
        self.relu = nn.ReLU(inplace=True)
    
    def forward(self, x):
        x2 = self.short(x)  # shortcut branch: projection, pooled identity, or identity
        x1 = self.conv1(x)
        x1 = self.conv2(x1)
        x1 = self.conv3(x1)
        x = x1 + x2
        x = self.relu(x)
        return x

3. Stacking residual blocks

In each stack, the first block's input and output shapes differ, so its residual connection needs a 1×1 convolution to raise the channel count before the Add operation.

The remaining blocks have matching input and output shapes, so the Add can be performed directly.
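A short shape check of that argument, using the first stack's numbers (64 input channels, filters=128, so the block outputs 2*filters = 256 channels):

```python
import torch
import torch.nn as nn

# First block of a stack: input has 64 channels but the block outputs
# 2*filters = 256, so the shortcut needs a 1x1 projection before the Add.
x = torch.randn(1, 64, 56, 56)
proj = nn.Conv2d(64, 256, kernel_size=1, stride=1, bias=False)
print(proj(x).shape)  # torch.Size([1, 256, 56, 56])
# Subsequent blocks map 256 -> 256, so the identity shortcut adds directly.
```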


class Stack(nn.Module):
    def __init__(self, in_channel, filters, blocks, stride=2, groups=32):
        super(Stack, self).__init__()
        self.conv = nn.Sequential()
        self.conv.add_module(str(0), Block(in_channel, filters, stride=stride, groups=groups, conv_shortcut=True))
        for i in range(1, blocks):
            self.conv.add_module(str(i), Block(2*filters, filters, stride=1, groups=groups, conv_shortcut=False))
    
    def forward(self, x):
        x = self.conv(x)
        return x
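The channel bookkeeping implied by `Stack` (each stack outputs 2*filters, which becomes the next stack's `in_channel`) can be checked in a couple of lines, using the filter sizes from the ResNeXt50 definition in the next section:

```python
# filters per stack (conv2 .. conv5) as passed to Stack
filters = [128, 256, 512, 1024]
out_channels = [2 * f for f in filters]
print(out_channels)  # [256, 512, 1024, 2048]
# Each stack's output channel count is the next stack's in_channel,
# and the final 2048 feeds the classifier head.
```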

4. Building the ResNeXt-50 network


''' ResNeXt50 '''
class ResNeXt50(nn.Module):
    def __init__(self,
                 include_top=True,  # whether to include the fully connected classifier head
                 preact=False,  # whether to use pre-activation
                 use_bias=True,  # whether the stem convolution uses a bias
                 input_shape=[32, 3, 224, 224],
                 classes=1000,  # number of output classes
                 pooling=None):  # optional global pooling ('avg' or 'max') when include_top=False
        super(ResNeXt50, self).__init__()
        
        self.conv1 = nn.Sequential()
        self.conv1.add_module('conv', nn.Conv2d(3, 64, 7, stride=2, padding=3, bias=use_bias, padding_mode='zeros'))
        if not preact:
            self.conv1.add_module('bn', nn.BatchNorm2d(64))
            self.conv1.add_module('relu', nn.ReLU())
        self.conv1.add_module('max_pool', nn.MaxPool2d(kernel_size=3, stride=2, padding=1))
        
        self.conv2 = Stack(64, 128, 3, stride=1)
        self.conv3 = Stack(256, 256, 4, stride=2)
        self.conv4 = Stack(512, 512, 6, stride=2)
        self.conv5 = Stack(1024, 1024, 3, stride=2)
        
        self.post = nn.Sequential()
        if preact:
            self.post.add_module('bn', nn.BatchNorm2d(2048))
            self.post.add_module('relu', nn.ReLU())
        if include_top:
            self.post.add_module('avg_pool', nn.AdaptiveAvgPool2d((1, 1)))
            self.post.add_module('flatten', nn.Flatten())
            self.post.add_module('fc', nn.Linear(2048, classes))
        else:
            if pooling=='avg':
                self.post.add_module('avg_pool', nn.AdaptiveAvgPool2d((1, 1)))
            elif pooling=='max':
                self.post.add_module('max_pool', nn.AdaptiveMaxPool2d((1, 1)))
    
    def forward(self, x):
        x = self.conv1(x)
        x = self.conv2(x)
        x = self.conv3(x)
        x = self.conv4(x)
        x = self.conv5(x)
        x = self.post(x)
        return x

5. Model summary


''' Instantiate the model and move it to the GPU (all our runs are on the GPU) '''
model = ResNeXt50(classes=num_classes).to(device)
''' Display the network structure '''
torchsummary.summary(model, (3, 224, 224))  # input_size excludes the batch dimension
#torchinfo.summary(model)
print(model)
----------------------------------------------------------------
        Layer (type)               Output Shape         Param #
================================================================
            Conv2d-1         [-1, 64, 112, 112]           9,472
       BatchNorm2d-2         [-1, 64, 112, 112]             128
              ReLU-3         [-1, 64, 112, 112]               0
         MaxPool2d-4           [-1, 64, 56, 56]               0
            Conv2d-5          [-1, 256, 56, 56]          16,384
            Conv2d-6          [-1, 128, 56, 56]           8,192
       BatchNorm2d-7          [-1, 128, 56, 56]             256
              ReLU-8          [-1, 128, 56, 56]               0
            Conv2d-9            [-1, 4, 56, 56]             144
           Conv2d-10            [-1, 4, 56, 56]             144
           Conv2d-11            [-1, 4, 56, 56]             144
           Conv2d-12            [-1, 4, 56, 56]             144
           Conv2d-13            [-1, 4, 56, 56]             144
           Conv2d-14            [-1, 4, 56, 56]             144
           Conv2d-15            [-1, 4, 56, 56]             144
           Conv2d-16            [-1, 4, 56, 56]             144
           Conv2d-17            [-1, 4, 56, 56]             144
           Conv2d-18            [-1, 4, 56, 56]             144
           Conv2d-19            [-1, 4, 56, 56]             144
           Conv2d-20            [-1, 4, 56, 56]             144
           Conv2d-21            [-1, 4, 56, 56]             144
           Conv2d-22            [-1, 4, 56, 56]             144
           Conv2d-23            [-1, 4, 56, 56]             144
           Conv2d-24            [-1, 4, 56, 56]             144
           Conv2d-25            [-1, 4, 56, 56]             144
           Conv2d-26            [-1, 4, 56, 56]             144
           Conv2d-27            [-1, 4, 56, 56]             144
           Conv2d-28            [-1, 4, 56, 56]             144
           Conv2d-29            [-1, 4, 56, 56]             144
           Conv2d-30            [-1, 4, 56, 56]             144
           Conv2d-31            [-1, 4, 56, 56]             144
           Conv2d-32            [-1, 4, 56, 56]             144
           Conv2d-33            [-1, 4, 56, 56]             144
           Conv2d-34            [-1, 4, 56, 56]             144
           Conv2d-35            [-1, 4, 56, 56]             144
           Conv2d-36            [-1, 4, 56, 56]             144
           Conv2d-37            [-1, 4, 56, 56]             144
           Conv2d-38            [-1, 4, 56, 56]             144
           Conv2d-39            [-1, 4, 56, 56]             144
           Conv2d-40            [-1, 4, 56, 56]             144
      BatchNorm2d-41          [-1, 128, 56, 56]             256
             ReLU-42          [-1, 128, 56, 56]               0
 GroupedConvBlock-43          [-1, 128, 56, 56]               0
           Conv2d-44          [-1, 256, 56, 56]          32,768
      BatchNorm2d-45          [-1, 256, 56, 56]             512
             ReLU-46          [-1, 256, 56, 56]               0
            Block-47          [-1, 256, 56, 56]               0
         Identity-48          [-1, 256, 56, 56]               0
           Conv2d-49          [-1, 128, 56, 56]          32,768
      BatchNorm2d-50          [-1, 128, 56, 56]             256
             ReLU-51          [-1, 128, 56, 56]               0
           Conv2d-52            [-1, 4, 56, 56]             144
           Conv2d-53            [-1, 4, 56, 56]             144
           Conv2d-54            [-1, 4, 56, 56]             144
           Conv2d-55            [-1, 4, 56, 56]             144
           Conv2d-56            [-1, 4, 56, 56]             144
           Conv2d-57            [-1, 4, 56, 56]             144
           Conv2d-58            [-1, 4, 56, 56]             144
           Conv2d-59            [-1, 4, 56, 56]             144
           Conv2d-60            [-1, 4, 56, 56]             144
           Conv2d-61            [-1, 4, 56, 56]             144
           Conv2d-62            [-1, 4, 56, 56]             144
           Conv2d-63            [-1, 4, 56, 56]             144
           Conv2d-64            [-1, 4, 56, 56]             144
           Conv2d-65            [-1, 4, 56, 56]             144
           Conv2d-66            [-1, 4, 56, 56]             144
           Conv2d-67            [-1, 4, 56, 56]             144
           Conv2d-68            [-1, 4, 56, 56]             144
           Conv2d-69            [-1, 4, 56, 56]             144
           Conv2d-70            [-1, 4, 56, 56]             144
           Conv2d-71            [-1, 4, 56, 56]             144
           Conv2d-72            [-1, 4, 56, 56]             144
           Conv2d-73            [-1, 4, 56, 56]             144
           Conv2d-74            [-1, 4, 56, 56]             144
           Conv2d-75            [-1, 4, 56, 56]             144
           Conv2d-76            [-1, 4, 56, 56]             144
           Conv2d-77            [-1, 4, 56, 56]             144
           Conv2d-78            [-1, 4, 56, 56]             144
           Conv2d-79            [-1, 4, 56, 56]             144
           Conv2d-80            [-1, 4, 56, 56]             144
           Conv2d-81            [-1, 4, 56, 56]             144
           Conv2d-82            [-1, 4, 56, 56]             144
           Conv2d-83            [-1, 4, 56, 56]             144
      BatchNorm2d-84          [-1, 128, 56, 56]             256
             ReLU-85          [-1, 128, 56, 56]               0
 GroupedConvBlock-86          [-1, 128, 56, 56]               0
           Conv2d-87          [-1, 256, 56, 56]          32,768
      BatchNorm2d-88          [-1, 256, 56, 56]             512
             ReLU-89          [-1, 256, 56, 56]               0
            Block-90          [-1, 256, 56, 56]               0
         Identity-91          [-1, 256, 56, 56]               0
           Conv2d-92          [-1, 128, 56, 56]          32,768
      BatchNorm2d-93          [-1, 128, 56, 56]             256
             ReLU-94          [-1, 128, 56, 56]               0
           Conv2d-95            [-1, 4, 56, 56]             144
           Conv2d-96            [-1, 4, 56, 56]             144
           Conv2d-97            [-1, 4, 56, 56]             144
           Conv2d-98            [-1, 4, 56, 56]             144
           Conv2d-99            [-1, 4, 56, 56]             144
          Conv2d-100            [-1, 4, 56, 56]             144
          Conv2d-101            [-1, 4, 56, 56]             144
          Conv2d-102            [-1, 4, 56, 56]             144
          Conv2d-103            [-1, 4, 56, 56]             144
          Conv2d-104            [-1, 4, 56, 56]             144
          Conv2d-105            [-1, 4, 56, 56]             144
          Conv2d-106            [-1, 4, 56, 56]             144
          Conv2d-107            [-1, 4, 56, 56]             144
          Conv2d-108            [-1, 4, 56, 56]             144
          Conv2d-109            [-1, 4, 56, 56]             144
          Conv2d-110            [-1, 4, 56, 56]             144
          Conv2d-111            [-1, 4, 56, 56]             144
          Conv2d-112            [-1, 4, 56, 56]             144
          Conv2d-113            [-1, 4, 56, 56]             144
          Conv2d-114            [-1, 4, 56, 56]             144
          Conv2d-115            [-1, 4, 56, 56]             144
          Conv2d-116            [-1, 4, 56, 56]             144
          Conv2d-117            [-1, 4, 56, 56]             144
          Conv2d-118            [-1, 4, 56, 56]             144
          Conv2d-119            [-1, 4, 56, 56]             144
          Conv2d-120            [-1, 4, 56, 56]             144
          Conv2d-121            [-1, 4, 56, 56]             144
          Conv2d-122            [-1, 4, 56, 56]             144
          Conv2d-123            [-1, 4, 56, 56]             144
          Conv2d-124            [-1, 4, 56, 56]             144
          Conv2d-125            [-1, 4, 56, 56]             144
          Conv2d-126            [-1, 4, 56, 56]             144
     BatchNorm2d-127          [-1, 128, 56, 56]             256
            ReLU-128          [-1, 128, 56, 56]               0
GroupedConvBlock-129          [-1, 128, 56, 56]               0
          Conv2d-130          [-1, 256, 56, 56]          32,768
     BatchNorm2d-131          [-1, 256, 56, 56]             512
            ReLU-132          [-1, 256, 56, 56]               0
           Block-133          [-1, 256, 56, 56]               0
           Stack-134          [-1, 256, 56, 56]               0
          Conv2d-135          [-1, 512, 28, 28]         131,072
          Conv2d-136          [-1, 256, 56, 56]          65,536
     BatchNorm2d-137          [-1, 256, 56, 56]             512
            ReLU-138          [-1, 256, 56, 56]               0
          Conv2d-139            [-1, 8, 28, 28]             576
          Conv2d-140            [-1, 8, 28, 28]             576
          Conv2d-141            [-1, 8, 28, 28]             576
          Conv2d-142            [-1, 8, 28, 28]             576
          Conv2d-143            [-1, 8, 28, 28]             576
          Conv2d-144            [-1, 8, 28, 28]             576
          Conv2d-145            [-1, 8, 28, 28]             576
          Conv2d-146            [-1, 8, 28, 28]             576
          Conv2d-147            [-1, 8, 28, 28]             576
          Conv2d-148            [-1, 8, 28, 28]             576
          Conv2d-149            [-1, 8, 28, 28]             576
          Conv2d-150            [-1, 8, 28, 28]             576
          Conv2d-151            [-1, 8, 28, 28]             576
          Conv2d-152            [-1, 8, 28, 28]             576
          Conv2d-153            [-1, 8, 28, 28]             576
          Conv2d-154            [-1, 8, 28, 28]             576
          Conv2d-155            [-1, 8, 28, 28]             576
          Conv2d-156            [-1, 8, 28, 28]             576
          Conv2d-157            [-1, 8, 28, 28]             576
          Conv2d-158            [-1, 8, 28, 28]             576
          Conv2d-159            [-1, 8, 28, 28]             576
          Conv2d-160            [-1, 8, 28, 28]             576
          Conv2d-161            [-1, 8, 28, 28]             576
          Conv2d-162            [-1, 8, 28, 28]             576
          Conv2d-163            [-1, 8, 28, 28]             576
          Conv2d-164            [-1, 8, 28, 28]             576
          Conv2d-165            [-1, 8, 28, 28]             576
          Conv2d-166            [-1, 8, 28, 28]             576
          Conv2d-167            [-1, 8, 28, 28]             576
          Conv2d-168            [-1, 8, 28, 28]             576
          Conv2d-169            [-1, 8, 28, 28]             576
          Conv2d-170            [-1, 8, 28, 28]             576
     BatchNorm2d-171          [-1, 256, 28, 28]             512
            ReLU-172          [-1, 256, 28, 28]               0
GroupedConvBlock-173          [-1, 256, 28, 28]               0
          Conv2d-174          [-1, 512, 28, 28]         131,072
     BatchNorm2d-175          [-1, 512, 28, 28]           1,024
            ReLU-176          [-1, 512, 28, 28]               0
           Block-177          [-1, 512, 28, 28]               0
        Identity-178          [-1, 512, 28, 28]               0
          Conv2d-179          [-1, 256, 28, 28]         131,072
     BatchNorm2d-180          [-1, 256, 28, 28]             512
            ReLU-181          [-1, 256, 28, 28]               0
          Conv2d-182            [-1, 8, 28, 28]             576
          Conv2d-183            [-1, 8, 28, 28]             576
          Conv2d-184            [-1, 8, 28, 28]             576
          Conv2d-185            [-1, 8, 28, 28]             576
          Conv2d-186            [-1, 8, 28, 28]             576
          Conv2d-187            [-1, 8, 28, 28]             576
          Conv2d-188            [-1, 8, 28, 28]             576
          Conv2d-189            [-1, 8, 28, 28]             576
          Conv2d-190            [-1, 8, 28, 28]             576
          Conv2d-191            [-1, 8, 28, 28]             576
          Conv2d-192            [-1, 8, 28, 28]             576
          Conv2d-193            [-1, 8, 28, 28]             576
          Conv2d-194            [-1, 8, 28, 28]             576
          Conv2d-195            [-1, 8, 28, 28]             576
          Conv2d-196            [-1, 8, 28, 28]             576
          Conv2d-197            [-1, 8, 28, 28]             576
          Conv2d-198            [-1, 8, 28, 28]             576
          Conv2d-199            [-1, 8, 28, 28]             576
          Conv2d-200            [-1, 8, 28, 28]             576
          Conv2d-201            [-1, 8, 28, 28]             576
          Conv2d-202            [-1, 8, 28, 28]             576
          Conv2d-203            [-1, 8, 28, 28]             576
          Conv2d-204            [-1, 8, 28, 28]             576
          Conv2d-205            [-1, 8, 28, 28]             576
          Conv2d-206            [-1, 8, 28, 28]             576
          Conv2d-207            [-1, 8, 28, 28]             576
          Conv2d-208            [-1, 8, 28, 28]             576
          Conv2d-209            [-1, 8, 28, 28]             576
          Conv2d-210            [-1, 8, 28, 28]             576
          Conv2d-211            [-1, 8, 28, 28]             576
          Conv2d-212            [-1, 8, 28, 28]             576
          Conv2d-213            [-1, 8, 28, 28]             576
     BatchNorm2d-214          [-1, 256, 28, 28]             512
            ReLU-215          [-1, 256, 28, 28]               0
GroupedConvBlock-216          [-1, 256, 28, 28]               0
          Conv2d-217          [-1, 512, 28, 28]         131,072
     BatchNorm2d-218          [-1, 512, 28, 28]           1,024
            ReLU-219          [-1, 512, 28, 28]               0
           Block-220          [-1, 512, 28, 28]               0
        Identity-221          [-1, 512, 28, 28]               0
          Conv2d-222          [-1, 256, 28, 28]         131,072
     BatchNorm2d-223          [-1, 256, 28, 28]             512
            ReLU-224          [-1, 256, 28, 28]               0
          Conv2d-225            [-1, 8, 28, 28]             576
          Conv2d-226            [-1, 8, 28, 28]             576
          Conv2d-227            [-1, 8, 28, 28]             576
          Conv2d-228            [-1, 8, 28, 28]             576
          Conv2d-229            [-1, 8, 28, 28]             576
          Conv2d-230            [-1, 8, 28, 28]             576
          Conv2d-231            [-1, 8, 28, 28]             576
          Conv2d-232            [-1, 8, 28, 28]             576
          Conv2d-233            [-1, 8, 28, 28]             576
          Conv2d-234            [-1, 8, 28, 28]             576
          Conv2d-235            [-1, 8, 28, 28]             576
          Conv2d-236            [-1, 8, 28, 28]             576
          Conv2d-237            [-1, 8, 28, 28]             576
          Conv2d-238            [-1, 8, 28, 28]             576
          Conv2d-239            [-1, 8, 28, 28]             576
          Conv2d-240            [-1, 8, 28, 28]             576
          Conv2d-241            [-1, 8, 28, 28]             576
          Conv2d-242            [-1, 8, 28, 28]             576
          Conv2d-243            [-1, 8, 28, 28]             576
          Conv2d-244            [-1, 8, 28, 28]             576
          Conv2d-245            [-1, 8, 28, 28]             576
          Conv2d-246            [-1, 8, 28, 28]             576
          Conv2d-247            [-1, 8, 28, 28]             576
          Conv2d-248            [-1, 8, 28, 28]             576
          Conv2d-249            [-1, 8, 28, 28]             576
          Conv2d-250            [-1, 8, 28, 28]             576
          Conv2d-251            [-1, 8, 28, 28]             576
          Conv2d-252            [-1, 8, 28, 28]             576
          Conv2d-253            [-1, 8, 28, 28]             576
          Conv2d-254            [-1, 8, 28, 28]             576
          Conv2d-255            [-1, 8, 28, 28]             576
          Conv2d-256            [-1, 8, 28, 28]             576
     BatchNorm2d-257          [-1, 256, 28, 28]             512
            ReLU-258          [-1, 256, 28, 28]               0
GroupedConvBlock-259          [-1, 256, 28, 28]               0
          Conv2d-260          [-1, 512, 28, 28]         131,072
     BatchNorm2d-261          [-1, 512, 28, 28]           1,024
            ReLU-262          [-1, 512, 28, 28]               0
           Block-263          [-1, 512, 28, 28]               0
        Identity-264          [-1, 512, 28, 28]               0
          Conv2d-265          [-1, 256, 28, 28]         131,072
     BatchNorm2d-266          [-1, 256, 28, 28]             512
            ReLU-267          [-1, 256, 28, 28]               0
          Conv2d-268            [-1, 8, 28, 28]             576
          Conv2d-269            [-1, 8, 28, 28]             576
          Conv2d-270            [-1, 8, 28, 28]             576
          Conv2d-271            [-1, 8, 28, 28]             576
          Conv2d-272            [-1, 8, 28, 28]             576
          Conv2d-273            [-1, 8, 28, 28]             576
          Conv2d-274            [-1, 8, 28, 28]             576
          Conv2d-275            [-1, 8, 28, 28]             576
          Conv2d-276            [-1, 8, 28, 28]             576
          Conv2d-277            [-1, 8, 28, 28]             576
          Conv2d-278            [-1, 8, 28, 28]             576
          Conv2d-279            [-1, 8, 28, 28]             576
          Conv2d-280            [-1, 8, 28, 28]             576
          Conv2d-281            [-1, 8, 28, 28]             576
          Conv2d-282            [-1, 8, 28, 28]             576
          Conv2d-283            [-1, 8, 28, 28]             576
          Conv2d-284            [-1, 8, 28, 28]             576
          Conv2d-285            [-1, 8, 28, 28]             576
          Conv2d-286            [-1, 8, 28, 28]             576
          Conv2d-287            [-1, 8, 28, 28]             576
          Conv2d-288            [-1, 8, 28, 28]             576
          Conv2d-289            [-1, 8, 28, 28]             576
          Conv2d-290            [-1, 8, 28, 28]             576
          Conv2d-291            [-1, 8, 28, 28]             576
          Conv2d-292            [-1, 8, 28, 28]             576
          Conv2d-293            [-1, 8, 28, 28]             576
          Conv2d-294            [-1, 8, 28, 28]             576
          Conv2d-295            [-1, 8, 28, 28]             576
          Conv2d-296            [-1, 8, 28, 28]             576
          Conv2d-297            [-1, 8, 28, 28]             576
          Conv2d-298            [-1, 8, 28, 28]             576
          Conv2d-299            [-1, 8, 28, 28]             576
     BatchNorm2d-300          [-1, 256, 28, 28]             512
            ReLU-301          [-1, 256, 28, 28]               0
GroupedConvBlock-302          [-1, 256, 28, 28]               0
          Conv2d-303          [-1, 512, 28, 28]         131,072
     BatchNorm2d-304          [-1, 512, 28, 28]           1,024
            ReLU-305          [-1, 512, 28, 28]               0
           Block-306          [-1, 512, 28, 28]               0
           Stack-307          [-1, 512, 28, 28]               0
          Conv2d-308         [-1, 1024, 14, 14]         524,288
          Conv2d-309          [-1, 512, 28, 28]         262,144
     BatchNorm2d-310          [-1, 512, 28, 28]           1,024
            ReLU-311          [-1, 512, 28, 28]               0
          Conv2d-312           [-1, 16, 14, 14]           2,304
          Conv2d-313           [-1, 16, 14, 14]           2,304
          Conv2d-314           [-1, 16, 14, 14]           2,304
          Conv2d-315           [-1, 16, 14, 14]           2,304
          Conv2d-316           [-1, 16, 14, 14]           2,304
          Conv2d-317           [-1, 16, 14, 14]           2,304
          Conv2d-318           [-1, 16, 14, 14]           2,304
          Conv2d-319           [-1, 16, 14, 14]           2,304
          Conv2d-320           [-1, 16, 14, 14]           2,304
          Conv2d-321           [-1, 16, 14, 14]           2,304
          Conv2d-322           [-1, 16, 14, 14]           2,304
          Conv2d-323           [-1, 16, 14, 14]           2,304
          Conv2d-324           [-1, 16, 14, 14]           2,304
          Conv2d-325           [-1, 16, 14, 14]           2,304
          Conv2d-326           [-1, 16, 14, 14]           2,304
          Conv2d-327           [-1, 16, 14, 14]           2,304
          Conv2d-328           [-1, 16, 14, 14]           2,304
          Conv2d-329           [-1, 16, 14, 14]           2,304
          Conv2d-330           [-1, 16, 14, 14]           2,304
          Conv2d-331           [-1, 16, 14, 14]           2,304
          Conv2d-332           [-1, 16, 14, 14]           2,304
          Conv2d-333           [-1, 16, 14, 14]           2,304
          Conv2d-334           [-1, 16, 14, 14]           2,304
          Conv2d-335           [-1, 16, 14, 14]           2,304
          Conv2d-336           [-1, 16, 14, 14]           2,304
          Conv2d-337           [-1, 16, 14, 14]           2,304
          Conv2d-338           [-1, 16, 14, 14]           2,304
          Conv2d-339           [-1, 16, 14, 14]           2,304
          Conv2d-340           [-1, 16, 14, 14]           2,304
          Conv2d-341           [-1, 16, 14, 14]           2,304
          Conv2d-342           [-1, 16, 14, 14]           2,304
          Conv2d-343           [-1, 16, 14, 14]           2,304
     BatchNorm2d-344          [-1, 512, 14, 14]           1,024
            ReLU-345          [-1, 512, 14, 14]               0
GroupedConvBlock-346          [-1, 512, 14, 14]               0
          Conv2d-347         [-1, 1024, 14, 14]         524,288
     BatchNorm2d-348         [-1, 1024, 14, 14]           2,048
            ReLU-349         [-1, 1024, 14, 14]               0
           Block-350         [-1, 1024, 14, 14]               0
        Identity-351         [-1, 1024, 14, 14]               0
          Conv2d-352          [-1, 512, 14, 14]         524,288
     BatchNorm2d-353          [-1, 512, 14, 14]           1,024
            ReLU-354          [-1, 512, 14, 14]               0
          Conv2d-355           [-1, 16, 14, 14]           2,304
          Conv2d-356           [-1, 16, 14, 14]           2,304
          Conv2d-357           [-1, 16, 14, 14]           2,304
          Conv2d-358           [-1, 16, 14, 14]           2,304
          Conv2d-359           [-1, 16, 14, 14]           2,304
          Conv2d-360           [-1, 16, 14, 14]           2,304
          Conv2d-361           [-1, 16, 14, 14]           2,304
          Conv2d-362           [-1, 16, 14, 14]           2,304
          Conv2d-363           [-1, 16, 14, 14]           2,304
          Conv2d-364           [-1, 16, 14, 14]           2,304
          Conv2d-365           [-1, 16, 14, 14]           2,304
          Conv2d-366           [-1, 16, 14, 14]           2,304
          Conv2d-367           [-1, 16, 14, 14]           2,304
          Conv2d-368           [-1, 16, 14, 14]           2,304
          Conv2d-369           [-1, 16, 14, 14]           2,304
          Conv2d-370           [-1, 16, 14, 14]           2,304
          Conv2d-371           [-1, 16, 14, 14]           2,304
          Conv2d-372           [-1, 16, 14, 14]           2,304
          Conv2d-373           [-1, 16, 14, 14]           2,304
          Conv2d-374           [-1, 16, 14, 14]           2,304
          Conv2d-375           [-1, 16, 14, 14]           2,304
          Conv2d-376           [-1, 16, 14, 14]           2,304
          Conv2d-377           [-1, 16, 14, 14]           2,304
          Conv2d-378           [-1, 16, 14, 14]           2,304
          Conv2d-379           [-1, 16, 14, 14]           2,304
          Conv2d-380           [-1, 16, 14, 14]           2,304
          Conv2d-381           [-1, 16, 14, 14]           2,304
          Conv2d-382           [-1, 16, 14, 14]           2,304
          Conv2d-383           [-1, 16, 14, 14]           2,304
          Conv2d-384           [-1, 16, 14, 14]           2,304
          Conv2d-385           [-1, 16, 14, 14]           2,304
          Conv2d-386           [-1, 16, 14, 14]           2,304
     BatchNorm2d-387          [-1, 512, 14, 14]           1,024
            ReLU-388          [-1, 512, 14, 14]               0
GroupedConvBlock-389          [-1, 512, 14, 14]               0
          Conv2d-390         [-1, 1024, 14, 14]         524,288
     BatchNorm2d-391         [-1, 1024, 14, 14]           2,048
            ReLU-392         [-1, 1024, 14, 14]               0
           Block-393         [-1, 1024, 14, 14]               0
        Identity-394         [-1, 1024, 14, 14]               0
          Conv2d-395          [-1, 512, 14, 14]         524,288
     BatchNorm2d-396          [-1, 512, 14, 14]           1,024
            ReLU-397          [-1, 512, 14, 14]               0
          Conv2d-398           [-1, 16, 14, 14]           2,304
          Conv2d-399           [-1, 16, 14, 14]           2,304
          Conv2d-400           [-1, 16, 14, 14]           2,304
          Conv2d-401           [-1, 16, 14, 14]           2,304
          Conv2d-402           [-1, 16, 14, 14]           2,304
          Conv2d-403           [-1, 16, 14, 14]           2,304
          Conv2d-404           [-1, 16, 14, 14]           2,304
          Conv2d-405           [-1, 16, 14, 14]           2,304
          Conv2d-406           [-1, 16, 14, 14]           2,304
          Conv2d-407           [-1, 16, 14, 14]           2,304
          Conv2d-408           [-1, 16, 14, 14]           2,304
          Conv2d-409           [-1, 16, 14, 14]           2,304
          Conv2d-410           [-1, 16, 14, 14]           2,304
          Conv2d-411           [-1, 16, 14, 14]           2,304
          Conv2d-412           [-1, 16, 14, 14]           2,304
          Conv2d-413           [-1, 16, 14, 14]           2,304
          Conv2d-414           [-1, 16, 14, 14]           2,304
          Conv2d-415           [-1, 16, 14, 14]           2,304
          Conv2d-416           [-1, 16, 14, 14]           2,304
          Conv2d-417           [-1, 16, 14, 14]           2,304
          Conv2d-418           [-1, 16, 14, 14]           2,304
          Conv2d-419           [-1, 16, 14, 14]           2,304
          Conv2d-420           [-1, 16, 14, 14]           2,304
          Conv2d-421           [-1, 16, 14, 14]           2,304
          Conv2d-422           [-1, 16, 14, 14]           2,304
          Conv2d-423           [-1, 16, 14, 14]           2,304
          Conv2d-424           [-1, 16, 14, 14]           2,304
          Conv2d-425           [-1, 16, 14, 14]           2,304
          Conv2d-426           [-1, 16, 14, 14]           2,304
          Conv2d-427           [-1, 16, 14, 14]           2,304
          Conv2d-428           [-1, 16, 14, 14]           2,304
          Conv2d-429           [-1, 16, 14, 14]           2,304
     BatchNorm2d-430          [-1, 512, 14, 14]           1,024
            ReLU-431          [-1, 512, 14, 14]               0
GroupedConvBlock-432          [-1, 512, 14, 14]               0
          Conv2d-433         [-1, 1024, 14, 14]         524,288
     BatchNorm2d-434         [-1, 1024, 14, 14]           2,048
            ReLU-435         [-1, 1024, 14, 14]               0
           Block-436         [-1, 1024, 14, 14]               0
        Identity-437         [-1, 1024, 14, 14]               0
          Conv2d-438          [-1, 512, 14, 14]         524,288
     BatchNorm2d-439          [-1, 512, 14, 14]           1,024
            ReLU-440          [-1, 512, 14, 14]               0
          Conv2d-441           [-1, 16, 14, 14]           2,304
          Conv2d-442           [-1, 16, 14, 14]           2,304
          Conv2d-443           [-1, 16, 14, 14]           2,304
          Conv2d-444           [-1, 16, 14, 14]           2,304
          Conv2d-445           [-1, 16, 14, 14]           2,304
          Conv2d-446           [-1, 16, 14, 14]           2,304
          Conv2d-447           [-1, 16, 14, 14]           2,304
          Conv2d-448           [-1, 16, 14, 14]           2,304
          Conv2d-449           [-1, 16, 14, 14]           2,304
          Conv2d-450           [-1, 16, 14, 14]           2,304
          Conv2d-451           [-1, 16, 14, 14]           2,304
          Conv2d-452           [-1, 16, 14, 14]           2,304
          Conv2d-453           [-1, 16, 14, 14]           2,304
          Conv2d-454           [-1, 16, 14, 14]           2,304
          Conv2d-455           [-1, 16, 14, 14]           2,304
          Conv2d-456           [-1, 16, 14, 14]           2,304
          Conv2d-457           [-1, 16, 14, 14]           2,304
          Conv2d-458           [-1, 16, 14, 14]           2,304
          Conv2d-459           [-1, 16, 14, 14]           2,304
          Conv2d-460           [-1, 16, 14, 14]           2,304
          Conv2d-461           [-1, 16, 14, 14]           2,304
          Conv2d-462           [-1, 16, 14, 14]           2,304
          Conv2d-463           [-1, 16, 14, 14]           2,304
          Conv2d-464           [-1, 16, 14, 14]           2,304
          Conv2d-465           [-1, 16, 14, 14]           2,304
          Conv2d-466           [-1, 16, 14, 14]           2,304
          Conv2d-467           [-1, 16, 14, 14]           2,304
          Conv2d-468           [-1, 16, 14, 14]           2,304
          Conv2d-469           [-1, 16, 14, 14]           2,304
          Conv2d-470           [-1, 16, 14, 14]           2,304
          Conv2d-471           [-1, 16, 14, 14]           2,304
          Conv2d-472           [-1, 16, 14, 14]           2,304
     BatchNorm2d-473          [-1, 512, 14, 14]           1,024
            ReLU-474          [-1, 512, 14, 14]               0
GroupedConvBlock-475          [-1, 512, 14, 14]               0
          Conv2d-476         [-1, 1024, 14, 14]         524,288
     BatchNorm2d-477         [-1, 1024, 14, 14]           2,048
            ReLU-478         [-1, 1024, 14, 14]               0
           Block-479         [-1, 1024, 14, 14]               0
        Identity-480         [-1, 1024, 14, 14]               0
          Conv2d-481          [-1, 512, 14, 14]         524,288
     BatchNorm2d-482          [-1, 512, 14, 14]           1,024
            ReLU-483          [-1, 512, 14, 14]               0
          Conv2d-484           [-1, 16, 14, 14]           2,304
          Conv2d-485           [-1, 16, 14, 14]           2,304
          Conv2d-486           [-1, 16, 14, 14]           2,304
          Conv2d-487           [-1, 16, 14, 14]           2,304
          Conv2d-488           [-1, 16, 14, 14]           2,304
          Conv2d-489           [-1, 16, 14, 14]           2,304
          Conv2d-490           [-1, 16, 14, 14]           2,304
          Conv2d-491           [-1, 16, 14, 14]           2,304
          Conv2d-492           [-1, 16, 14, 14]           2,304
          Conv2d-493           [-1, 16, 14, 14]           2,304
          Conv2d-494           [-1, 16, 14, 14]           2,304
          Conv2d-495           [-1, 16, 14, 14]           2,304
          Conv2d-496           [-1, 16, 14, 14]           2,304
          Conv2d-497           [-1, 16, 14, 14]           2,304
          Conv2d-498           [-1, 16, 14, 14]           2,304
          Conv2d-499           [-1, 16, 14, 14]           2,304
          Conv2d-500           [-1, 16, 14, 14]           2,304
          Conv2d-501           [-1, 16, 14, 14]           2,304
          Conv2d-502           [-1, 16, 14, 14]           2,304
          Conv2d-503           [-1, 16, 14, 14]           2,304
          Conv2d-504           [-1, 16, 14, 14]           2,304
          Conv2d-505           [-1, 16, 14, 14]           2,304
          Conv2d-506           [-1, 16, 14, 14]           2,304
          Conv2d-507           [-1, 16, 14, 14]           2,304
          Conv2d-508           [-1, 16, 14, 14]           2,304
          Conv2d-509           [-1, 16, 14, 14]           2,304
          Conv2d-510           [-1, 16, 14, 14]           2,304
          Conv2d-511           [-1, 16, 14, 14]           2,304
          Conv2d-512           [-1, 16, 14, 14]           2,304
          Conv2d-513           [-1, 16, 14, 14]           2,304
          Conv2d-514           [-1, 16, 14, 14]           2,304
          Conv2d-515           [-1, 16, 14, 14]           2,304
     BatchNorm2d-516          [-1, 512, 14, 14]           1,024
            ReLU-517          [-1, 512, 14, 14]               0
GroupedConvBlock-518          [-1, 512, 14, 14]               0
          Conv2d-519         [-1, 1024, 14, 14]         524,288
     BatchNorm2d-520         [-1, 1024, 14, 14]           2,048
            ReLU-521         [-1, 1024, 14, 14]               0
           Block-522         [-1, 1024, 14, 14]               0
        Identity-523         [-1, 1024, 14, 14]               0
          Conv2d-524          [-1, 512, 14, 14]         524,288
     BatchNorm2d-525          [-1, 512, 14, 14]           1,024
            ReLU-526          [-1, 512, 14, 14]               0
          Conv2d-527           [-1, 16, 14, 14]           2,304
          Conv2d-528           [-1, 16, 14, 14]           2,304
          Conv2d-529           [-1, 16, 14, 14]           2,304
          Conv2d-530           [-1, 16, 14, 14]           2,304
          Conv2d-531           [-1, 16, 14, 14]           2,304
          Conv2d-532           [-1, 16, 14, 14]           2,304
          Conv2d-533           [-1, 16, 14, 14]           2,304
          Conv2d-534           [-1, 16, 14, 14]           2,304
          Conv2d-535           [-1, 16, 14, 14]           2,304
          Conv2d-536           [-1, 16, 14, 14]           2,304
          Conv2d-537           [-1, 16, 14, 14]           2,304
          Conv2d-538           [-1, 16, 14, 14]           2,304
          Conv2d-539           [-1, 16, 14, 14]           2,304
          Conv2d-540           [-1, 16, 14, 14]           2,304
          Conv2d-541           [-1, 16, 14, 14]           2,304
          Conv2d-542           [-1, 16, 14, 14]           2,304
          Conv2d-543           [-1, 16, 14, 14]           2,304
          Conv2d-544           [-1, 16, 14, 14]           2,304
          Conv2d-545           [-1, 16, 14, 14]           2,304
          Conv2d-546           [-1, 16, 14, 14]           2,304
          Conv2d-547           [-1, 16, 14, 14]           2,304
          Conv2d-548           [-1, 16, 14, 14]           2,304
          Conv2d-549           [-1, 16, 14, 14]           2,304
          Conv2d-550           [-1, 16, 14, 14]           2,304
          Conv2d-551           [-1, 16, 14, 14]           2,304
          Conv2d-552           [-1, 16, 14, 14]           2,304
          Conv2d-553           [-1, 16, 14, 14]           2,304
          Conv2d-554           [-1, 16, 14, 14]           2,304
          Conv2d-555           [-1, 16, 14, 14]           2,304
          Conv2d-556           [-1, 16, 14, 14]           2,304
          Conv2d-557           [-1, 16, 14, 14]           2,304
          Conv2d-558           [-1, 16, 14, 14]           2,304
     BatchNorm2d-559          [-1, 512, 14, 14]           1,024
            ReLU-560          [-1, 512, 14, 14]               0
GroupedConvBlock-561          [-1, 512, 14, 14]               0
          Conv2d-562         [-1, 1024, 14, 14]         524,288
     BatchNorm2d-563         [-1, 1024, 14, 14]           2,048
            ReLU-564         [-1, 1024, 14, 14]               0
           Block-565         [-1, 1024, 14, 14]               0
           Stack-566         [-1, 1024, 14, 14]               0
          Conv2d-567           [-1, 2048, 7, 7]       2,097,152
          Conv2d-568         [-1, 1024, 14, 14]       1,048,576
     BatchNorm2d-569         [-1, 1024, 14, 14]           2,048
            ReLU-570         [-1, 1024, 14, 14]               0
          Conv2d-571             [-1, 32, 7, 7]           9,216
          Conv2d-572             [-1, 32, 7, 7]           9,216
          Conv2d-573             [-1, 32, 7, 7]           9,216
          Conv2d-574             [-1, 32, 7, 7]           9,216
          Conv2d-575             [-1, 32, 7, 7]           9,216
          Conv2d-576             [-1, 32, 7, 7]           9,216
          Conv2d-577             [-1, 32, 7, 7]           9,216
          Conv2d-578             [-1, 32, 7, 7]           9,216
          Conv2d-579             [-1, 32, 7, 7]           9,216
          Conv2d-580             [-1, 32, 7, 7]           9,216
          Conv2d-581             [-1, 32, 7, 7]           9,216
          Conv2d-582             [-1, 32, 7, 7]           9,216
          Conv2d-583             [-1, 32, 7, 7]           9,216
          Conv2d-584             [-1, 32, 7, 7]           9,216
          Conv2d-585             [-1, 32, 7, 7]           9,216
          Conv2d-586             [-1, 32, 7, 7]           9,216
          Conv2d-587             [-1, 32, 7, 7]           9,216
          Conv2d-588             [-1, 32, 7, 7]           9,216
          Conv2d-589             [-1, 32, 7, 7]           9,216
          Conv2d-590             [-1, 32, 7, 7]           9,216
          Conv2d-591             [-1, 32, 7, 7]           9,216
          Conv2d-592             [-1, 32, 7, 7]           9,216
          Conv2d-593             [-1, 32, 7, 7]           9,216
          Conv2d-594             [-1, 32, 7, 7]           9,216
          Conv2d-595             [-1, 32, 7, 7]           9,216
          Conv2d-596             [-1, 32, 7, 7]           9,216
          Conv2d-597             [-1, 32, 7, 7]           9,216
          Conv2d-598             [-1, 32, 7, 7]           9,216
          Conv2d-599             [-1, 32, 7, 7]           9,216
          Conv2d-600             [-1, 32, 7, 7]           9,216
          Conv2d-601             [-1, 32, 7, 7]           9,216
          Conv2d-602             [-1, 32, 7, 7]           9,216
     BatchNorm2d-603           [-1, 1024, 7, 7]           2,048
            ReLU-604           [-1, 1024, 7, 7]               0
GroupedConvBlock-605           [-1, 1024, 7, 7]               0
          Conv2d-606           [-1, 2048, 7, 7]       2,097,152
     BatchNorm2d-607           [-1, 2048, 7, 7]           4,096
            ReLU-608           [-1, 2048, 7, 7]               0
           Block-609           [-1, 2048, 7, 7]               0
        Identity-610           [-1, 2048, 7, 7]               0
          Conv2d-611           [-1, 1024, 7, 7]       2,097,152
     BatchNorm2d-612           [-1, 1024, 7, 7]           2,048
            ReLU-613           [-1, 1024, 7, 7]               0
          Conv2d-614             [-1, 32, 7, 7]           9,216
          Conv2d-615             [-1, 32, 7, 7]           9,216
          Conv2d-616             [-1, 32, 7, 7]           9,216
          Conv2d-617             [-1, 32, 7, 7]           9,216
          Conv2d-618             [-1, 32, 7, 7]           9,216
          Conv2d-619             [-1, 32, 7, 7]           9,216
          Conv2d-620             [-1, 32, 7, 7]           9,216
          Conv2d-621             [-1, 32, 7, 7]           9,216
          Conv2d-622             [-1, 32, 7, 7]           9,216
          Conv2d-623             [-1, 32, 7, 7]           9,216
          Conv2d-624             [-1, 32, 7, 7]           9,216
          Conv2d-625             [-1, 32, 7, 7]           9,216
          Conv2d-626             [-1, 32, 7, 7]           9,216
          Conv2d-627             [-1, 32, 7, 7]           9,216
          Conv2d-628             [-1, 32, 7, 7]           9,216
          Conv2d-629             [-1, 32, 7, 7]           9,216
          Conv2d-630             [-1, 32, 7, 7]           9,216
          Conv2d-631             [-1, 32, 7, 7]           9,216
          Conv2d-632             [-1, 32, 7, 7]           9,216
          Conv2d-633             [-1, 32, 7, 7]           9,216
          Conv2d-634             [-1, 32, 7, 7]           9,216
          Conv2d-635             [-1, 32, 7, 7]           9,216
          Conv2d-636             [-1, 32, 7, 7]           9,216
          Conv2d-637             [-1, 32, 7, 7]           9,216
          Conv2d-638             [-1, 32, 7, 7]           9,216
          Conv2d-639             [-1, 32, 7, 7]           9,216
          Conv2d-640             [-1, 32, 7, 7]           9,216
          Conv2d-641             [-1, 32, 7, 7]           9,216
          Conv2d-642             [-1, 32, 7, 7]           9,216
          Conv2d-643             [-1, 32, 7, 7]           9,216
          Conv2d-644             [-1, 32, 7, 7]           9,216
          Conv2d-645             [-1, 32, 7, 7]           9,216
     BatchNorm2d-646           [-1, 1024, 7, 7]           2,048
            ReLU-647           [-1, 1024, 7, 7]               0
GroupedConvBlock-648           [-1, 1024, 7, 7]               0
          Conv2d-649           [-1, 2048, 7, 7]       2,097,152
     BatchNorm2d-650           [-1, 2048, 7, 7]           4,096
            ReLU-651           [-1, 2048, 7, 7]               0
           Block-652           [-1, 2048, 7, 7]               0
        Identity-653           [-1, 2048, 7, 7]               0
          Conv2d-654           [-1, 1024, 7, 7]       2,097,152
     BatchNorm2d-655           [-1, 1024, 7, 7]           2,048
            ReLU-656           [-1, 1024, 7, 7]               0
          Conv2d-657             [-1, 32, 7, 7]           9,216
          Conv2d-658             [-1, 32, 7, 7]           9,216
          Conv2d-659             [-1, 32, 7, 7]           9,216
          Conv2d-660             [-1, 32, 7, 7]           9,216
          Conv2d-661             [-1, 32, 7, 7]           9,216
          Conv2d-662             [-1, 32, 7, 7]           9,216
          Conv2d-663             [-1, 32, 7, 7]           9,216
          Conv2d-664             [-1, 32, 7, 7]           9,216
          Conv2d-665             [-1, 32, 7, 7]           9,216
          Conv2d-666             [-1, 32, 7, 7]           9,216
          Conv2d-667             [-1, 32, 7, 7]           9,216
          Conv2d-668             [-1, 32, 7, 7]           9,216
          Conv2d-669             [-1, 32, 7, 7]           9,216
          Conv2d-670             [-1, 32, 7, 7]           9,216
          Conv2d-671             [-1, 32, 7, 7]           9,216
          Conv2d-672             [-1, 32, 7, 7]           9,216
          Conv2d-673             [-1, 32, 7, 7]           9,216
          Conv2d-674             [-1, 32, 7, 7]           9,216
          Conv2d-675             [-1, 32, 7, 7]           9,216
          Conv2d-676             [-1, 32, 7, 7]           9,216
          Conv2d-677             [-1, 32, 7, 7]           9,216
          Conv2d-678             [-1, 32, 7, 7]           9,216
          Conv2d-679             [-1, 32, 7, 7]           9,216
          Conv2d-680             [-1, 32, 7, 7]           9,216
          Conv2d-681             [-1, 32, 7, 7]           9,216
          Conv2d-682             [-1, 32, 7, 7]           9,216
          Conv2d-683             [-1, 32, 7, 7]           9,216
          Conv2d-684             [-1, 32, 7, 7]           9,216
          Conv2d-685             [-1, 32, 7, 7]           9,216
          Conv2d-686             [-1, 32, 7, 7]           9,216
          Conv2d-687             [-1, 32, 7, 7]           9,216
          Conv2d-688             [-1, 32, 7, 7]           9,216
     BatchNorm2d-689           [-1, 1024, 7, 7]           2,048
            ReLU-690           [-1, 1024, 7, 7]               0
GroupedConvBlock-691           [-1, 1024, 7, 7]               0
          Conv2d-692           [-1, 2048, 7, 7]       2,097,152
     BatchNorm2d-693           [-1, 2048, 7, 7]           4,096
            ReLU-694           [-1, 2048, 7, 7]               0
           Block-695           [-1, 2048, 7, 7]               0
           Stack-696           [-1, 2048, 7, 7]               0
AdaptiveAvgPool2d-697           [-1, 2048, 1, 1]               0
         Flatten-698                 [-1, 2048]               0
          Linear-699                    [-1, 2]           4,098
================================================================
Total params: 22,976,386
Trainable params: 22,976,386
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 0.57
Forward/backward pass size (MB): 413.47
Params size (MB): 87.65
Estimated Total Size (MB): 501.69
----------------------------------------------------------------
ResNeXt50(
  (conv1): Sequential(
    (conv): Conv2d(3, 64, kernel_size=(7, 7), stride=(2, 2), padding=(3, 3))
    (bn): BatchNorm2d(64, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
    (relu): ReLU()
    (max_pool): MaxPool2d(kernel_size=3, stride=2, padding=1, dilation=1, ceil_mode=False)
  )
  (conv2): Stack(
    (conv): Sequential(
      (0): Block(
        (short): Conv2d(64, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
        (conv1): Sequential(
          (0): Conv2d(64, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (2): ReLU(inplace=True)
        )
        (conv2): GroupedConvBlock(
          (conv): Conv2d(4, 4, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (norm): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (conv3): Sequential(
          (0): Conv2d(128, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (relu): ReLU(inplace=True)
      )
      (1): Block(
        (short): Identity()
        (conv1): Sequential(
          (0): Conv2d(256, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (2): ReLU(inplace=True)
        )
        (conv2): GroupedConvBlock(
          (conv): Conv2d(4, 4, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (norm): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (conv3): Sequential(
          (0): Conv2d(128, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (relu): ReLU(inplace=True)
      )
      (2): Block(
        (short): Identity()
        (conv1): Sequential(
          (0): Conv2d(256, 128, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (2): ReLU(inplace=True)
        )
        (conv2): GroupedConvBlock(
          (conv): Conv2d(4, 4, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (norm): BatchNorm2d(128, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (conv3): Sequential(
          (0): Conv2d(128, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (relu): ReLU(inplace=True)
      )
    )
  )
  (conv3): Stack(
    (conv): Sequential(
      (0): Block(
        (short): Conv2d(256, 512, kernel_size=(1, 1), stride=(2, 2), bias=False)
        (conv1): Sequential(
          (0): Conv2d(256, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (2): ReLU(inplace=True)
        )
        (conv2): GroupedConvBlock(
          (conv): Conv2d(8, 8, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
          (norm): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (conv3): Sequential(
          (0): Conv2d(256, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (relu): ReLU(inplace=True)
      )
      (1): Block(
        (short): Identity()
        (conv1): Sequential(
          (0): Conv2d(512, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (2): ReLU(inplace=True)
        )
        (conv2): GroupedConvBlock(
          (conv): Conv2d(8, 8, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (norm): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (conv3): Sequential(
          (0): Conv2d(256, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (relu): ReLU(inplace=True)
      )
      (2): Block(
        (short): Identity()
        (conv1): Sequential(
          (0): Conv2d(512, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (2): ReLU(inplace=True)
        )
        (conv2): GroupedConvBlock(
          (conv): Conv2d(8, 8, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (norm): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (conv3): Sequential(
          (0): Conv2d(256, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (relu): ReLU(inplace=True)
      )
      (3): Block(
        (short): Identity()
        (conv1): Sequential(
          (0): Conv2d(512, 256, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (2): ReLU(inplace=True)
        )
        (conv2): GroupedConvBlock(
          (conv): Conv2d(8, 8, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (norm): BatchNorm2d(256, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (conv3): Sequential(
          (0): Conv2d(256, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (relu): ReLU(inplace=True)
      )
    )
  )
  (conv4): Stack(
    (conv): Sequential(
      (0): Block(
        (short): Conv2d(512, 1024, kernel_size=(1, 1), stride=(2, 2), bias=False)
        (conv1): Sequential(
          (0): Conv2d(512, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (2): ReLU(inplace=True)
        )
        (conv2): GroupedConvBlock(
          (conv): Conv2d(16, 16, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
          (norm): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (conv3): Sequential(
          (0): Conv2d(512, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (relu): ReLU(inplace=True)
      )
      (1): Block(
        (short): Identity()
        (conv1): Sequential(
          (0): Conv2d(1024, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (2): ReLU(inplace=True)
        )
        (conv2): GroupedConvBlock(
          (conv): Conv2d(16, 16, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (norm): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (conv3): Sequential(
          (0): Conv2d(512, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (relu): ReLU(inplace=True)
      )
      (2): Block(
        (short): Identity()
        (conv1): Sequential(
          (0): Conv2d(1024, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (2): ReLU(inplace=True)
        )
        (conv2): GroupedConvBlock(
          (conv): Conv2d(16, 16, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (norm): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (conv3): Sequential(
          (0): Conv2d(512, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (relu): ReLU(inplace=True)
      )
      (3): Block(
        (short): Identity()
        (conv1): Sequential(
          (0): Conv2d(1024, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (2): ReLU(inplace=True)
        )
        (conv2): GroupedConvBlock(
          (conv): Conv2d(16, 16, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (norm): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (conv3): Sequential(
          (0): Conv2d(512, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (relu): ReLU(inplace=True)
      )
      (4): Block(
        (short): Identity()
        (conv1): Sequential(
          (0): Conv2d(1024, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (2): ReLU(inplace=True)
        )
        (conv2): GroupedConvBlock(
          (conv): Conv2d(16, 16, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (norm): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (conv3): Sequential(
          (0): Conv2d(512, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (relu): ReLU(inplace=True)
      )
      (5): Block(
        (short): Identity()
        (conv1): Sequential(
          (0): Conv2d(1024, 512, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (2): ReLU(inplace=True)
        )
        (conv2): GroupedConvBlock(
          (conv): Conv2d(16, 16, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (norm): BatchNorm2d(512, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (conv3): Sequential(
          (0): Conv2d(512, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (relu): ReLU(inplace=True)
      )
    )
  )
  (conv5): Stack(
    (conv): Sequential(
      (0): Block(
        (short): Conv2d(1024, 2048, kernel_size=(1, 1), stride=(2, 2), bias=False)
        (conv1): Sequential(
          (0): Conv2d(1024, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (2): ReLU(inplace=True)
        )
        (conv2): GroupedConvBlock(
          (conv): Conv2d(32, 32, kernel_size=(3, 3), stride=(2, 2), padding=(1, 1), bias=False)
          (norm): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (conv3): Sequential(
          (0): Conv2d(1024, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (relu): ReLU(inplace=True)
      )
      (1): Block(
        (short): Identity()
        (conv1): Sequential(
          (0): Conv2d(2048, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (2): ReLU(inplace=True)
        )
        (conv2): GroupedConvBlock(
          (conv): Conv2d(32, 32, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (norm): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (conv3): Sequential(
          (0): Conv2d(1024, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (relu): ReLU(inplace=True)
      )
      (2): Block(
        (short): Identity()
        (conv1): Sequential(
          (0): Conv2d(2048, 1024, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (2): ReLU(inplace=True)
        )
        (conv2): GroupedConvBlock(
          (conv): Conv2d(32, 32, kernel_size=(3, 3), stride=(1, 1), padding=(1, 1), bias=False)
          (norm): BatchNorm2d(1024, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
          (relu): ReLU(inplace=True)
        )
        (conv3): Sequential(
          (0): Conv2d(1024, 2048, kernel_size=(1, 1), stride=(1, 1), bias=False)
          (1): BatchNorm2d(2048, eps=1e-05, momentum=0.1, affine=True, track_running_stats=True)
        )
        (relu): ReLU(inplace=True)
      )
    )
  )
  (post): Sequential(
    (avg_pool): AdaptiveAvgPool2d(output_size=(1, 1))
    (flatten): Flatten(start_dim=1, end_dim=-1)
    (fc): Linear(in_features=2048, out_features=2, bias=True)
  )
)
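A note on the output above: each `GroupedConvBlock` shows up in the summary as 32 identical small `Conv2d` entries (e.g. `Conv2d(4, 4, ...)`), because the loop-based implementation applies one shared convolution to every group. As an aside (not part of the original code), `nn.Conv2d` also supports grouped convolution natively via its `groups` argument, which runs in a single kernel and gives each group its own weights, matching the original ResNeXt design. A minimal sketch:

```python
import torch
import torch.nn as nn

# Native grouped convolution: 128 input channels split into 32 groups of 4,
# each group convolved with its own 3x3 filters.
grouped = nn.Conv2d(128, 128, kernel_size=3, stride=1, padding=1,
                    groups=32, bias=False)

x = torch.randn(1, 128, 56, 56)
y = grouped(x)
print(y.shape)  # torch.Size([1, 128, 56, 56])

# Parameter count: out_ch * (in_ch / groups) * k * k = 128 * 4 * 9 = 4608,
# i.e. 1/32 of the 147,456 parameters of a dense 128->128 3x3 conv.
print(sum(p.numel() for p in grouped.parameters()))  # 4608
```

Replacing the per-group Python loop with a single `groups=32` convolution would also remove the repeated `Conv2d` rows from the summary, since the whole grouped operation becomes one module.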
