Receptive field calculation tool for neural network models
Caffe model visualization tool: Netscope
Paste the .prototxt text on the left side, then press Shift + Enter to draw the network structure graph
Visualization tool: ConvNetDraw
Visualizing PyTorch models with pytorchviz
pytorchviz on GitHub
Install: pip install torchviz (it depends on the graphviz package: pip install graphviz)
Tutorial: visualizing PyTorch models with TensorBoard
Install TensorBoard:
conda install tensorboard or pip install tensorboard
import torch
from torch.utils.tensorboard import SummaryWriter

x = torch.rand([1, 3, 224, 640])  # .cuda()
net = Yolov4tinysegfusion(11, 3, 2)  # .cuda()
with SummaryWriter(log_dir='Yolov4tinysegfusion') as w:  # TensorBoard model visualization
    w.add_graph(net, (x,))
Open a terminal and run tensorboard --logdir="path to logdir"
TensorFlow installation not found - running with reduced feature set.
Serving TensorBoard on localhost; to expose to the network, use a proxy or pass --bind_all
TensorBoard 2.2.1 at http://localhost:6006/ (Press CTRL+C to quit) <== **copy this address into a browser to view**
TensorBoard open-source project
Visualizing PyTorch models with Netron
Installing Netron
Run pip install netron in a terminal
Once installed, run netron in the terminal, then open localhost:8080 in a browser
You can also use the online Netron viewer provided by the author, at:
Online visualization
Click Open Model and select the saved model file to see the corresponding network architecture
import torch
from torch import nn
from torchviz import make_dot, make_dot_from_trace
model = nn.Sequential()
model.add_module('W0', nn.Linear(8, 16))
model.add_module('tanh', nn.Tanh())
model.add_module('W1', nn.Linear(16, 1))
torch.save(model, 'model.pth') # save the whole model with torch.save(model, ...), not just the parameters via torch.save(model.state_dict(), ...)
Counting the parameters and FLOPs of a PyTorch model: thop
PyTorch-OpCounter GitHub
OpCounter
PyTorch-OpCounter is very simple to install and use:
pip install thop
The code on GitHub is always the most up to date, so you can also install directly from the GitHub repository.
Counting rules can be customized, so special operations can also be included in the statistics.
import torch
from torchvision import models
from thop import profile
model = models.densenet121()
input = torch.randn(1, 3, 224, 224)
flops, params = profile(model, inputs=(input, ))
Output:
FLOPs of DenseNet-121: 2913996800.0
Parameters of DenseNet-121: 7978856.0
Model parameter counts in PyTorch can also be obtained with torchsummary's summary
import torch
from torchsummary import summary
from nets.yolo4_tiny import YoloBody
if __name__ == "__main__":
    # use device to specify whether the network runs on GPU or CPU
    device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
    model = YoloBody(3, 20).to(device)
    summary(model, input_size=(3, 416, 416))

Output:
----------------------------------------------------------------
Layer (type) Output Shape Param #
================================================================
Conv2d-1 [-1, 32, 208, 208] 864
BatchNorm2d-2 [-1, 32, 208, 208] 64
LeakyReLU-3 [-1, 32, 208, 208] 0
BasicConv-4 [-1, 32, 208, 208] 0
Conv2d-5 [-1, 64, 104, 104] 18,432
BatchNorm2d-6 [-1, 64, 104, 104] 128
LeakyReLU-7 [-1, 64, 104, 104] 0
BasicConv-8 [-1, 64, 104, 104] 0
Conv2d-9 [-1, 64, 104, 104] 36,864
BatchNorm2d-10 [-1, 64, 104, 104] 128
LeakyReLU-11 [-1, 64, 104, 104] 0
BasicConv-12 [-1, 64, 104, 104] 0
Conv2d-13 [-1, 32, 104, 104] 9,216
BatchNorm2d-14 [-1, 32, 104, 104] 64
LeakyReLU-15 [-1, 32, 104, 104] 0
BasicConv-16 [-1, 32, 104, 104] 0
Conv2d-17 [-1, 32, 104, 104] 9,216
BatchNorm2d-18 [-1, 32, 104, 104] 64
LeakyReLU-19 [-1, 32, 104, 104] 0
BasicConv-20 [-1, 32, 104, 104] 0
Conv2d-21 [-1, 64, 104, 104] 4,096
BatchNorm2d-22 [-1, 64, 104, 104] 128
LeakyReLU-23 [-1, 64, 104, 104] 0
BasicConv-24 [-1, 64, 104, 104] 0
MaxPool2d-25 [-1, 128, 52, 52] 0
Resblock_body-26 [[-1, 128, 52, 52], [-1, 64, 104, 104]] 0
Conv2d-27 [-1, 128, 52, 52] 147,456
BatchNorm2d-28 [-1, 128, 52, 52] 256
LeakyReLU-29 [-1, 128, 52, 52] 0
BasicConv-30 [-1, 128, 52, 52] 0
Conv2d-31 [-1, 64, 52, 52] 36,864
BatchNorm2d-32 [-1, 64, 52, 52] 128
LeakyReLU-33 [-1, 64, 52, 52] 0
BasicConv-34 [-1, 64, 52, 52] 0
Conv2d-35 [-1, 64, 52, 52] 36,864
BatchNorm2d-36 [-1, 64, 52, 52] 128
LeakyReLU-37 [-1, 64, 52, 52] 0
BasicConv-38 [-1, 64, 52, 52] 0
Conv2d-39 [-1, 128, 52, 52] 16,384
BatchNorm2d-40 [-1, 128, 52, 52] 256
LeakyReLU-41 [-1, 128, 52, 52] 0
BasicConv-42 [-1, 128, 52, 52] 0
MaxPool2d-43 [-1, 256, 26, 26] 0
Resblock_body-44 [[-1, 256, 26, 26], [-1, 128, 52, 52]] 0
Conv2d-45 [-1, 256, 26, 26] 589,824
BatchNorm2d-46 [-1, 256, 26, 26] 512
LeakyReLU-47 [-1, 256, 26, 26] 0
BasicConv-48 [-1, 256, 26, 26] 0
Conv2d-49 [-1, 128, 26, 26] 147,456
BatchNorm2d-50 [-1, 128, 26, 26] 256
LeakyReLU-51 [-1, 128, 26, 26] 0
BasicConv-52 [-1, 128, 26, 26] 0
Conv2d-53 [-1, 128, 26, 26] 147,456
BatchNorm2d-54 [-1, 128, 26, 26] 256
LeakyReLU-55 [-1, 128, 26, 26] 0
BasicConv-56 [-1, 128, 26, 26] 0
Conv2d-57 [-1, 256, 26, 26] 65,536
BatchNorm2d-58 [-1, 256, 26, 26] 512
LeakyReLU-59 [-1, 256, 26, 26] 0
BasicConv-60 [-1, 256, 26, 26] 0
MaxPool2d-61 [-1, 512, 13, 13] 0
Resblock_body-62 [[-1, 512, 13, 13], [-1, 256, 26, 26]] 0
Conv2d-63 [-1, 512, 13, 13] 2,359,296
BatchNorm2d-64 [-1, 512, 13, 13] 1,024
LeakyReLU-65 [-1, 512, 13, 13] 0
BasicConv-66 [-1, 512, 13, 13] 0
CSPDarkNet-67 [[-1, 256, 26, 26], [-1, 512, 13, 13]] 0
Conv2d-68 [-1, 256, 13, 13] 131,072
BatchNorm2d-69 [-1, 256, 13, 13] 512
LeakyReLU-70 [-1, 256, 13, 13] 0
BasicConv-71 [-1, 256, 13, 13] 0
Conv2d-72 [-1, 512, 13, 13] 1,179,648
BatchNorm2d-73 [-1, 512, 13, 13] 1,024
LeakyReLU-74 [-1, 512, 13, 13] 0
BasicConv-75 [-1, 512, 13, 13] 0
Conv2d-76 [-1, 75, 13, 13] 38,475
Conv2d-77 [-1, 128, 13, 13] 32,768
BatchNorm2d-78 [-1, 128, 13, 13] 256
LeakyReLU-79 [-1, 128, 13, 13] 0
BasicConv-80 [-1, 128, 13, 13] 0
Upsample-81 [-1, 128, 26, 26] 0
Upsample-82 [-1, 128, 26, 26] 0
Conv2d-83 [-1, 256, 26, 26] 884,736
BatchNorm2d-84 [-1, 256, 26, 26] 512
LeakyReLU-85 [-1, 256, 26, 26] 0
BasicConv-86 [-1, 256, 26, 26] 0
Conv2d-87 [-1, 75, 26, 26] 19,275
================================================================
Total params: 5,918,006
Trainable params: 5,918,006
Non-trainable params: 0
----------------------------------------------------------------
Input size (MB): 1.98
Forward/backward pass size (MB): 2513174.75
Params size (MB): 22.58
Estimated Total Size (MB): 2513199.31
----------------------------------------------------------------
Other visualization tools:
https://blog.csdn.net/dcrmg/article/details/103014890?utm_medium=distribute.pc_relevant_t0.none-task-blog-OPENSEARCH-1.edu_weight&depth_1-utm_source=distribute.pc_relevant_t0.none-task-blog-OPENSEARCH-1.edu_weight