《Gradient-based Learning Applied to Document Recognition》
Paper: http://lushuangning.oss-cn-beijing.aliyuncs.com/CNN%E5%AD%A6%E4%B9%A0%E7%B3%BB%E5%88%97/Gradient-Based_Learning_Applied_to_Document_Recognition.pdf
The seminal work of convolutional neural networks.
《ImageNet Classification with Deep Convolutional Neural Networks》
Paper: https://dl.acm.org/doi/10.1145/3065386
The first work to apply convolutional neural networks and deep learning to large-scale image recognition; it introduced the dropout layer, which later provided inspiration for the BN layer.
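The dropout idea can be sketched in a few lines of NumPy. This is the inverted-dropout variant (scaling at training time, so inference needs no correction); all names here are illustrative, not from the paper:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    # Inverted dropout: zero each activation with probability p and
    # scale survivors by 1/(1-p) so the expected value is unchanged.
    if not training or p == 0.0:
        return x
    rng = rng if rng is not None else np.random.default_rng(0)
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

x = np.ones((4, 8))
y = dropout(x, p=0.5)           # entries are either 0.0 or 2.0
z = dropout(x, training=False)  # identity at inference time
```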
《Rich feature hierarchies for accurate object detection and semantic segmentation》
The first work to combine object detection with convolutional neural networks and bring it to industrial-grade use.
R-CNN, Fast R-CNN, Faster R-CNN
Unifies object segmentation and recognition in one framework.
https://zhuanlan.zhihu.com/p/64694855
https://blog.csdn.net/v1_vivian/article/details/78599229
《Spatial Pyramid Pooling in Deep Convolutional Networks for Visual Recognition》
Paper: https://arxiv.org/abs/1406.4729
Draws on the feature-pyramid idea.
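SPP's core trick, pooling an arbitrary-sized feature map into a fixed-length vector, can be sketched in NumPy. Computing bin boundaries with linspace is a simplification of the paper's ceil/floor scheme:

```python
import numpy as np

def spp(fmap, levels=(1, 2, 4)):
    # Max-pool a (C, H, W) feature map into n x n bins for each pyramid level,
    # then concatenate: output length C * sum(n*n) is independent of H and W.
    C, H, W = fmap.shape
    pooled = []
    for n in levels:
        hb = np.linspace(0, H, n + 1).astype(int)
        wb = np.linspace(0, W, n + 1).astype(int)
        for i in range(n):
            for j in range(n):
                pooled.append(fmap[:, hb[i]:hb[i + 1], wb[j]:wb[j + 1]].max(axis=(1, 2)))
    return np.concatenate(pooled)

rng = np.random.default_rng(0)
v1 = spp(rng.random((8, 13, 17)))  # a 13x17 feature map ...
v2 = spp(rng.random((8, 9, 11)))   # ... and a 9x11 one give the same length
```

This is what lets the network accept variable-sized inputs in front of fixed-size fully-connected layers.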
V1:《Going deeper with convolutions》
V2:《Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift》
V3:《Rethinking the Inception Architecture for Computer Vision》(2015)
V1:https://ieeexplore.ieee.org/document/7298594
V2:https://arxiv.org/pdf/1502.03167.pdf
V3:https://arxiv.org/pdf/1512.00567.pdf
A deeper network that directly tackles the computational-cost and vanishing-gradient problems.
GoogLeNet V2:《Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift》
Paper: https://arxiv.org/pdf/1502.03167.pdf
Later networks gradually adopted the BN layer.
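The BN forward pass for a fully-connected layer is only a few lines. This is a training-time sketch; the running statistics used at inference are omitted:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    # Normalize each feature over the batch, then apply a learned scale and shift.
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    return gamma * x_hat + beta

x = np.random.default_rng(0).normal(loc=5.0, scale=3.0, size=(64, 10))
y = batch_norm(x, gamma=np.ones(10), beta=np.zeros(10))
# y has per-feature mean ~0 and variance ~1 regardless of the input scale.
```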
《Very Deep Convolutional Networks for Large-Scale Image Recognition》
Paper: https://arxiv.org/abs/1409.1556
Model: https://worksheets.codalab.org/worksheets/0xe2ac460eee7443438d5ab9f43824a819
Uses a much deeper network and demonstrates the importance of pre-training and weight initialization. Ushered in the era of 3×3 convolutions, greatly reducing the number of parameters.
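The parameter saving from stacked 3×3 kernels is simple arithmetic. Assuming C input and output channels and ignoring biases, two stacked 3×3 layers cover a 5×5 receptive field with 18C² weights versus 25C², and three cover 7×7 with 27C² versus 49C²:

```python
C = 256  # channels in = channels out; bias terms ignored

two_3x3   = 2 * (3 * 3 * C * C)  # two stacked 3x3 layers: 5x5 receptive field
one_5x5   = 5 * 5 * C * C
three_3x3 = 3 * (3 * 3 * C * C)  # three stacked 3x3 layers: 7x7 receptive field
one_7x7   = 7 * 7 * C * C

print(two_3x3 / one_5x5)    # 0.72
print(three_3x3 / one_7x7)  # ~0.55
```

The deeper stack also inserts extra nonlinearities, which VGG argues makes the function more discriminative.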
《Deep Residual Learning for Image Recognition》
Paper: https://arxiv.org/pdf/1512.03385.pdf
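The residual unit can be sketched with linear layers standing in for the 3×3 convolutions (a simplification; all names are illustrative):

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def residual_block(x, W1, W2):
    # y = ReLU(F(x) + x): the identity shortcut lets the block default to a
    # (near-)identity mapping, which is what keeps very deep stacks trainable.
    f = relu(x @ W1) @ W2  # F(x): two layers standing in for two 3x3 convs
    return relu(f + x)

x = np.random.default_rng(0).normal(size=(4, 16))
# With zero weights the residual branch vanishes and the block is just ReLU(x).
y = residual_block(x, np.zeros((16, 16)), np.zeros((16, 16)))
```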
《Xception: Deep Learning with Depthwise Separable Convolutions》
Paper: https://arxiv.org/abs/1610.02357
Surpasses ResNet and Inception V3.
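Xception is built on depthwise separable convolutions, which replace one full K×K convolution with a K×K depthwise step plus a 1×1 pointwise step; the weight-count ratio works out to roughly 1/C_out + 1/K². A quick check:

```python
K, C_in, C_out = 3, 128, 128

standard  = K * K * C_in * C_out          # one full KxK convolution
separable = K * K * C_in + C_in * C_out   # depthwise KxK + pointwise 1x1

ratio = separable / standard  # equals 1/C_out + 1/K**2, about 0.119 here
print(f"{standard} -> {separable} weights ({1 / ratio:.1f}x fewer)")
```

The same factorization is what makes the MobileNet family cheap enough for mobile devices.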
《Inception-v4, Inception-ResNet and the Impact of Residual Connections on Learning》
Paper: https://arxiv.org/pdf/1602.07261.pdf
《Identity Mappings in Deep Residual Networks》
Paper: https://arxiv.org/abs/1603.05027
An improvement on ResNet.
《Densely Connected Convolutional Networks》
Paper: https://openaccess.thecvf.com/content_cvpr_2017/html/Huang_Densely_Connected_Convolutional_CVPR_2017_paper.html
Model: https://github.com/liuzhuang13/DenseNet
《Aggregated Residual Transformations for Deep Neural Networks》
Paper: https://arxiv.org/pdf/1611.05431.pdf
Model: https://github.com/facebookresearch/ResNeXt
《MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications》
Paper: https://arxiv.org/abs/1704.04861
Note that there are V1, V2, and V3 versions.
《Learning Transferable Architectures for Scalable Image Recognition》
Paper: https://arxiv.org/abs/1707.07012
《Squeeze-and-Excitation Networks》
Paper: https://arxiv.org/abs/1709.01507
Model: https://github.com/hujie-frank/SENet
A convolutional neural network that introduces an attention mechanism.
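The SE block's squeeze-and-excitation step fits in a few NumPy lines. Matrix products stand in for the paper's FC layers; the reduction ratio r=4 and all names here are illustrative:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def se_block(fmap, W1, W2):
    # Squeeze: global average pool each channel down to one number.
    z = fmap.mean(axis=(1, 2))                 # (C,)
    # Excite: bottleneck MLP + sigmoid gives per-channel weights in (0, 1).
    s = sigmoid(np.maximum(z @ W1, 0.0) @ W2)  # (C,)
    # Reweight the feature map channel by channel.
    return fmap * s[:, None, None]

rng = np.random.default_rng(0)
C, r = 16, 4
fmap = rng.random((C, 8, 8))
out = se_block(fmap, rng.normal(size=(C, C // r)), rng.normal(size=(C // r, C)))
```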
《ShuffleNet: An Extremely Efficient Convolutional Neural Network for Mobile Devices》
Paper: https://arxiv.org/abs/1707.01083
There are V1 and V2 versions.
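ShuffleNet's channel shuffle is a pure reshape-transpose-reshape, cheap but enough to mix information across the groups of a grouped convolution:

```python
import numpy as np

def channel_shuffle(x, groups):
    # (C, H, W) -> (g, C//g, H, W) -> swap the two group axes -> back to (C, H, W).
    C, H, W = x.shape
    return x.reshape(groups, C // groups, H, W).transpose(1, 0, 2, 3).reshape(C, H, W)

x = np.arange(6 * 2 * 2).reshape(6, 2, 2)
y = channel_shuffle(x, groups=2)
# Channel order 0,1,2,3,4,5 becomes 0,3,1,4,2,5: each adjacent output pair
# now draws from both input groups.
```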
《Bag of Tricks for Image Classification with Convolutional Neural Networks》
Paper: https://arxiv.org/abs/1812.01187
Common tricks for image classification.
《EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks》
Paper: https://arxiv.org/abs/1905.11946v5
《ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks》
Paper: https://arxiv.org/abs/1910.03151
Attention mechanism.
《Designing Network Design Spaces》
Paper: https://arxiv.org/abs/2003.13678
《GhostNet: More Features from Cheap Operations》
Paper: https://arxiv.org/abs/1911.11907
https://blog.csdn.net/fendouaini/article/details/109280085
《EfficientNetV2: Smaller Models and Faster Training》
Paper: https://arxiv.org/abs/2104.00298
《Revisiting ResNets: Improved Training and Scaling Strategies》
Paper: https://arxiv.org/pdf/2103.07579.pdf
Also covers some data-processing papers.
https://zhuanlan.zhihu.com/p/354936159
《A ConvNet for the 2020s》
Paper: https://arxiv.org/abs/2201.03545
Image processing.
https://zhuanlan.zhihu.com/p/478286484
Will keep updating as new papers come up...