Classic Deep Learning Algorithms: Papers, Walkthroughs, and Code Implementations

Table of Contents

  • Classic CNN Algorithms
    • LeNet-5
    • AlexNet
    • VGG
    • Inception
      • Inception-v1 (GoogLeNet)
      • BN-Inception
    • ResNet
    • R-CNN
      • R-CNN
      • Fast R-CNN
      • Faster R-CNN
    • YOLO
      • YOLO v1
      • YOLO v2
      • YOLO v3
      • YOLO v4
  • Classic RNN Algorithms
    • RNN
    • GRU
    • LSTM
    • Encoder-Decoder
    • Attention
    • Transformer

Classic CNN Algorithms

LeNet-5

  • Source paper: LeCun, Yann, et al. “Gradient-based learning applied to document recognition.” Proceedings of the IEEE 86.11 (1998): 2278-2324.
  • Paper walkthrough: LeNet-5 explained (an introductory CNN algorithm)
  • Code implementation: https://github.com/TaavishThaman/LeNet-5-with-Keras (see the sketch below)
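
As a companion to the linked repository, here is a minimal LeNet-5-style sketch in tf.keras. The layer sizes follow the paper (C1=6@5×5, S2, C3=16@5×5, S4, C5=120, F6=84, 10 outputs), and tanh plus average pooling mirror the original; treat it as an illustrative sketch, not the reference implementation.

```python
# Minimal LeNet-5-style model in tf.keras (illustrative only).
import tensorflow as tf
from tensorflow.keras import layers, models

def build_lenet5(input_shape=(32, 32, 1), num_classes=10):
    return models.Sequential([
        tf.keras.Input(shape=input_shape),
        layers.Conv2D(6, 5, activation="tanh"),     # C1: 6 feature maps, 5x5
        layers.AveragePooling2D(2),                 # S2: 2x2 subsampling
        layers.Conv2D(16, 5, activation="tanh"),    # C3: 16 feature maps, 5x5
        layers.AveragePooling2D(2),                 # S4: 2x2 subsampling
        layers.Conv2D(120, 5, activation="tanh"),   # C5: 120 feature maps
        layers.Flatten(),
        layers.Dense(84, activation="tanh"),        # F6
        layers.Dense(num_classes, activation="softmax"),
    ])

build_lenet5().summary()
```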

AlexNet

  • Source paper: Krizhevsky, Alex, Ilya Sutskever, and Geoffrey E. Hinton. “Imagenet classification with deep convolutional neural networks.” Advances in neural information processing systems. 2012.
  • Paper walkthrough: AlexNet, a classic CNN algorithm
  • Code implementation: https://github.com/hjptriplebee/AlexNet_with_tensorflow (see the sketch below)
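
For orientation, a single-stream AlexNet-style sketch in tf.keras follows. It assumes a 227×227×3 input and omits the original two-GPU split and local response normalization, so it is a rough outline rather than the linked TensorFlow implementation.

```python
# Simplified single-GPU AlexNet-style model in tf.keras (illustrative only).
import tensorflow as tf
from tensorflow.keras import layers, models

def build_alexnet(input_shape=(227, 227, 3), num_classes=1000):
    return models.Sequential([
        tf.keras.Input(shape=input_shape),
        layers.Conv2D(96, 11, strides=4, activation="relu"),
        layers.MaxPooling2D(3, strides=2),
        layers.Conv2D(256, 5, padding="same", activation="relu"),
        layers.MaxPooling2D(3, strides=2),
        layers.Conv2D(384, 3, padding="same", activation="relu"),
        layers.Conv2D(384, 3, padding="same", activation="relu"),
        layers.Conv2D(256, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(3, strides=2),
        layers.Flatten(),
        layers.Dense(4096, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(4096, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])

build_alexnet().summary()
```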

VGG

  • Source paper: Simonyan, Karen, and Andrew Zisserman. “Very deep convolutional networks for large-scale image recognition.” arXiv preprint arXiv:1409.1556 (2014).
  • Paper walkthrough: VGGNet, a classic CNN algorithm
  • Code and pretrained resources: VGGNet pretrained models and code resources (see the sketch below)
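
The defining idea of VGG is stacking small 3×3 convolutions into repeated blocks. Below is a minimal VGG-16-style sketch in tf.keras built from such blocks; for real use, rely on the linked pretrained weights.

```python
# VGG-16-style model from repeated "3x3 conv x N, then 2x2 max-pool" blocks.
import tensorflow as tf
from tensorflow.keras import layers, models

def vgg_block(x, filters, n_convs):
    # N stacked 3x3 convolutions followed by a 2x2 max-pool.
    for _ in range(n_convs):
        x = layers.Conv2D(filters, 3, padding="same", activation="relu")(x)
    return layers.MaxPooling2D(2, strides=2)(x)

def build_vgg16(input_shape=(224, 224, 3), num_classes=1000):
    inputs = tf.keras.Input(shape=input_shape)
    x = inputs
    for filters, n_convs in [(64, 2), (128, 2), (256, 3), (512, 3), (512, 3)]:
        x = vgg_block(x, filters, n_convs)
    x = layers.Flatten()(x)
    x = layers.Dense(4096, activation="relu")(x)
    x = layers.Dense(4096, activation="relu")(x)
    outputs = layers.Dense(num_classes, activation="softmax")(x)
    return models.Model(inputs, outputs)

build_vgg16().summary()
```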

Inception

Inception-v1 (GoogLeNet)

  • Source paper: Szegedy, Christian, et al. “Going deeper with convolutions.” Proceedings of the IEEE conference on computer vision and pattern recognition. 2015.
  • Paper walkthrough: Inception v1 (GoogLeNet), a classic CNN algorithm
  • Code and pretrained resources: GoogLeNet code resources (TensorFlow) (see the sketch below)
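
GoogLeNet's building block is the Inception module: four parallel branches concatenated along the channel axis. A minimal tf.keras sketch using the filter counts of the paper's inception (3a) module:

```python
# One Inception module: 1x1, 1x1->3x3, 1x1->5x5, and pool->1x1 branches.
import tensorflow as tf
from tensorflow.keras import layers

def inception_module(x, f1, f3_reduce, f3, f5_reduce, f5, pool_proj):
    b1 = layers.Conv2D(f1, 1, padding="same", activation="relu")(x)
    b2 = layers.Conv2D(f3_reduce, 1, padding="same", activation="relu")(x)
    b2 = layers.Conv2D(f3, 3, padding="same", activation="relu")(b2)
    b3 = layers.Conv2D(f5_reduce, 1, padding="same", activation="relu")(x)
    b3 = layers.Conv2D(f5, 5, padding="same", activation="relu")(b3)
    b4 = layers.MaxPooling2D(3, strides=1, padding="same")(x)
    b4 = layers.Conv2D(pool_proj, 1, padding="same", activation="relu")(b4)
    return layers.Concatenate()([b1, b2, b3, b4])

inputs = tf.keras.Input(shape=(28, 28, 192))
outputs = inception_module(inputs, 64, 96, 128, 16, 32, 32)  # -> 28x28x256
tf.keras.Model(inputs, outputs).summary()
```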

BN-Inception

  • Source paper: Ioffe, Sergey, and Christian Szegedy. “Batch normalization: Accelerating deep network training by reducing internal covariate shift.” International conference on machine learning. PMLR, 2015.
  • Paper walkthrough: BN-Inception, a classic CNN algorithm
  • Code and pretrained resources: BN-Inception code resources (see the sketch below)
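
The key change in BN-Inception is inserting batch normalization between each convolution and its nonlinearity. A minimal conv-BN-ReLU building block sketched in tf.keras (the 7×7, stride-2 stem convolution here is only an example):

```python
# Conv -> BatchNorm -> ReLU block, the basic unit of BN-Inception.
import tensorflow as tf
from tensorflow.keras import layers

def conv_bn_relu(x, filters, kernel_size, strides=1):
    # Bias is redundant when the convolution is followed by batch norm.
    x = layers.Conv2D(filters, kernel_size, strides=strides,
                      padding="same", use_bias=False)(x)
    x = layers.BatchNormalization()(x)
    return layers.ReLU()(x)

inputs = tf.keras.Input(shape=(224, 224, 3))
x = conv_bn_relu(inputs, 64, 7, strides=2)
tf.keras.Model(inputs, x).summary()
```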

ResNet

  • Source paper: He, Kaiming, et al. “Deep residual learning for image recognition.” Proceedings of the IEEE conference on computer vision and pattern recognition. 2016.
  • Paper walkthrough: The ResNet paper explained
  • Code implementation: ResNet code (with detailed comments) + dataset download link (see the sketch below)
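
ResNet's core idea is the residual block: the output of a small stack of convolutions is added to a shortcut of its input. Below is a minimal tf.keras sketch of the basic (ResNet-18/34-style) block, including the 1×1 projection used when the shape changes; it is illustrative, not the linked implementation.

```python
# Basic residual block: two 3x3 convs plus an identity/projection shortcut.
import tensorflow as tf
from tensorflow.keras import layers

def residual_block(x, filters, strides=1):
    shortcut = x
    y = layers.Conv2D(filters, 3, strides=strides, padding="same",
                      use_bias=False)(x)
    y = layers.BatchNormalization()(y)
    y = layers.ReLU()(y)
    y = layers.Conv2D(filters, 3, padding="same", use_bias=False)(y)
    y = layers.BatchNormalization()(y)
    if strides != 1 or x.shape[-1] != filters:
        # Project the shortcut when spatial size or channel count changes.
        shortcut = layers.Conv2D(filters, 1, strides=strides, use_bias=False)(x)
        shortcut = layers.BatchNormalization()(shortcut)
    return layers.ReLU()(layers.Add()([y, shortcut]))

inputs = tf.keras.Input(shape=(56, 56, 64))
outputs = residual_block(inputs, 128, strides=2)  # downsampling block
tf.keras.Model(inputs, outputs).summary()
```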

R-CNN

R-CNN

  • Source paper: Girshick, Ross, et al. “Rich feature hierarchies for accurate object detection and semantic segmentation.” Proceedings of the IEEE conference on computer vision and pattern recognition. 2014.
  • Paper walkthrough: The R-CNN paper explained
  • Code implementation: R-CNN code

Fast R-CNN

  • Source paper: Girshick, Ross. “Fast r-cnn.” Proceedings of the IEEE international conference on computer vision. 2015.
  • Paper walkthrough: The Fast R-CNN paper explained
  • Code implementation: Fast R-CNN code (see the RoI-pooling sketch below)
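
Fast R-CNN's main addition over R-CNN is RoI pooling: every region proposal is cropped from one shared feature map and max-pooled onto a fixed grid, so a single forward pass serves all proposals. A naive NumPy sketch of the idea (real implementations work on batched tensors and handle sub-pixel bins):

```python
# Naive RoI max-pooling over a single feature map (illustrative only).
import numpy as np

def roi_pool(feature_map, roi, output_size=(7, 7)):
    """feature_map: (H, W, C); roi: (x1, y1, x2, y2) in feature-map coords."""
    x1, y1, x2, y2 = roi
    region = feature_map[y1:y2, x1:x2, :]
    h, w, c = region.shape
    out_h, out_w = output_size
    pooled = np.zeros((out_h, out_w, c), dtype=feature_map.dtype)
    # Split the region into an out_h x out_w grid and max-pool each cell.
    ys = np.linspace(0, h, out_h + 1, dtype=int)
    xs = np.linspace(0, w, out_w + 1, dtype=int)
    for i in range(out_h):
        for j in range(out_w):
            cell = region[ys[i]:max(ys[i + 1], ys[i] + 1),
                          xs[j]:max(xs[j + 1], xs[j] + 1), :]
            pooled[i, j] = cell.max(axis=(0, 1))
    return pooled

fmap = np.random.rand(32, 32, 256).astype(np.float32)
print(roi_pool(fmap, roi=(4, 6, 20, 30)).shape)  # (7, 7, 256)
```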

Faster R-CNN

  • Source paper: Ren, Shaoqing, et al. “Faster r-cnn: Towards real-time object detection with region proposal networks.” Advances in neural information processing systems 28 (2015).
  • Paper walkthrough: Faster R-CNN in one read
  • Code implementation: Faster R-CNN code

YOLO

YOLO v1

  • Source paper: Redmon, Joseph, et al. “You only look once: Unified, real-time object detection.” Proceedings of the IEEE conference on computer vision and pattern recognition. 2016.
  • Paper walkthrough: YOLO v1 explained
  • Code implementation: YOLO v1 code (see the decoding sketch below)
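
YOLO v1 casts detection as a single regression from the image to an S×S×(B·5+C) tensor (S=7, B=2, C=20 on Pascal VOC). The sketch below decodes such an output into class-scored boxes following the paper's parameterization (cell-relative x, y; image-relative w, h); non-maximum suppression is omitted.

```python
# Decode a YOLO v1 output grid into candidate boxes (no NMS, illustrative).
import numpy as np

S, B, C = 7, 2, 20

def decode(pred, conf_thresh=0.2):
    """pred: (S, S, B*5 + C); returns (cx, cy, w, h, score, class) tuples."""
    boxes = []
    for row in range(S):
        for col in range(S):
            cell = pred[row, col]
            class_probs = cell[B * 5:]            # P(class | object)
            for b in range(B):
                x, y, w, h, conf = cell[b * 5:(b + 1) * 5]
                cx = (col + x) / S                # x, y are cell-relative
                cy = (row + y) / S
                scores = conf * class_probs       # class-specific confidence
                cls = int(np.argmax(scores))
                if scores[cls] > conf_thresh:
                    boxes.append((cx, cy, w, h, float(scores[cls]), cls))
    return boxes

pred = np.random.rand(S, S, B * 5 + C)
print(len(decode(pred)), "candidate boxes before NMS")
```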

YOLO v2

  • Source paper: Redmon, Joseph, and Ali Farhadi. “YOLO9000: Better, faster, stronger.” Proceedings of the IEEE conference on computer vision and pattern recognition. 2017.
  • Paper walkthrough: YOLO v2 explained
  • Code implementation: YOLO v2 code

YOLO v3

  • Source paper: Redmon, Joseph, and Ali Farhadi. “Yolov3: An incremental improvement.” arXiv preprint arXiv:1804.02767 (2018).
  • Paper walkthrough: YOLO v3 explained
  • Code implementation: YOLO v3 code

YOLO v4

  • Source paper: Bochkovskiy, Alexey, Chien-Yao Wang, and Hong-Yuan Mark Liao. “Yolov4: Optimal speed and accuracy of object detection.” arXiv preprint arXiv:2004.10934 (2020).
  • Paper walkthrough: YOLO v4 explained
  • Code implementation: YOLO v4 code

Classic RNN Algorithms

RNN

  • Source paper: Sherstinsky, Alex. “Fundamentals of recurrent neural network (RNN) and long short-term memory (LSTM) network.” Physica D: Nonlinear Phenomena 404 (2020): 132306.
  • Paper walkthrough: RNN in plain language
  • Code implementation: RNN code (see the sketch below)
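
A vanilla RNN reduces to a single recurrence, h_t = tanh(W_xh x_t + W_hh h_{t-1} + b_h). A minimal NumPy sketch that unrolls it over a short sequence (dimensions are arbitrary placeholders):

```python
# Unrolled forward pass of a vanilla RNN cell (illustrative only).
import numpy as np

rng = np.random.default_rng(0)
input_dim, hidden_dim, seq_len = 8, 16, 5

W_xh = rng.normal(scale=0.1, size=(hidden_dim, input_dim))
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim))
b_h = np.zeros(hidden_dim)

def rnn_forward(xs):
    h = np.zeros(hidden_dim)
    hs = []
    for x_t in xs:                       # iterate over time steps
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
        hs.append(h)
    return np.stack(hs)                  # (seq_len, hidden_dim)

xs = rng.normal(size=(seq_len, input_dim))
print(rnn_forward(xs).shape)             # (5, 16)
```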

GRU

  • Source paper: Cho, Kyunghyun, et al. “Learning phrase representations using RNN encoder-decoder for statistical machine translation.” arXiv preprint arXiv:1406.1078 (2014).
  • Paper walkthrough: GRU (Gated Recurrent Unit), made easy
  • Code implementation: GRU code (see the sketch below)
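
The GRU keeps a single hidden state controlled by two gates, update z and reset r. A one-step NumPy sketch using the interpolation convention of Cho et al. (2014), with biases omitted for brevity:

```python
# One GRU step:
#   z = sigmoid(W_z x + U_z h_prev)            update gate
#   r = sigmoid(W_r x + U_r h_prev)            reset gate
#   h_cand = tanh(W_h x + U_h (r * h_prev))    candidate state
#   h = z * h_prev + (1 - z) * h_cand
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def gru_step(x, h_prev, params):
    W_z, U_z, W_r, U_r, W_h, U_h = params
    z = sigmoid(W_z @ x + U_z @ h_prev)
    r = sigmoid(W_r @ x + U_r @ h_prev)
    h_cand = np.tanh(W_h @ x + U_h @ (r * h_prev))
    return z * h_prev + (1.0 - z) * h_cand

rng = np.random.default_rng(0)
d_in, d_h = 8, 16
params = [rng.normal(scale=0.1, size=s)
          for s in [(d_h, d_in), (d_h, d_h)] * 3]
h = gru_step(rng.normal(size=d_in), np.zeros(d_h), params)
print(h.shape)  # (16,)
```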

LSTM

  • Source paper: Hochreiter, Sepp, and Jürgen Schmidhuber. “Long short-term memory.” Neural computation 9.8 (1997): 1735-1780.
  • Paper walkthrough: Understanding LSTM step by step, starting from RNN
  • Code implementation: LSTM code (see the sketch below)
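
An LSTM step combines forget, input, and output gates with a separate cell state (the forget gate is a later, now-standard extension of the 1997 formulation). A one-step NumPy sketch with biases omitted:

```python
# One LSTM step with the standard gate equations (illustrative only).
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def lstm_step(x, h_prev, c_prev, params):
    W_f, U_f, W_i, U_i, W_o, U_o, W_g, U_g = params
    f = sigmoid(W_f @ x + U_f @ h_prev)      # forget gate
    i = sigmoid(W_i @ x + U_i @ h_prev)      # input gate
    o = sigmoid(W_o @ x + U_o @ h_prev)      # output gate
    g = np.tanh(W_g @ x + U_g @ h_prev)      # candidate cell state
    c = f * c_prev + i * g                   # new cell state
    h = o * np.tanh(c)                       # new hidden state
    return h, c

rng = np.random.default_rng(0)
d_in, d_h = 8, 16
params = [rng.normal(scale=0.1, size=s)
          for s in [(d_h, d_in), (d_h, d_h)] * 4]
h, c = lstm_step(rng.normal(size=d_in), np.zeros(d_h), np.zeros(d_h), params)
print(h.shape, c.shape)
```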

Encoder-Decoder

  • Source paper: Sutskever, Ilya, Oriol Vinyals, and Quoc V. Le. “Sequence to sequence learning with neural networks.” Advances in neural information processing systems 27 (2014).
  • Paper walkthrough: The Encoder-Decoder model architecture explained
  • Code implementation: Encoder-Decoder code (see the sketch below)
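
The encoder-decoder (seq2seq) idea: an encoder RNN compresses the source sequence into a fixed-size state, which initializes a decoder RNN that generates the target sequence. A minimal tf.keras training-graph sketch (vocabulary sizes and dimensions are made-up placeholders; inference-time decoding is not shown):

```python
# Training graph of an LSTM encoder-decoder with teacher forcing.
import tensorflow as tf
from tensorflow.keras import layers

src_vocab, tgt_vocab, latent_dim = 5000, 6000, 256

# Encoder: keep only the final LSTM states.
enc_inputs = tf.keras.Input(shape=(None,), dtype="int32")
enc_emb = layers.Embedding(src_vocab, latent_dim)(enc_inputs)
_, state_h, state_c = layers.LSTM(latent_dim, return_state=True)(enc_emb)

# Decoder: conditioned on the encoder states (teacher forcing at train time).
dec_inputs = tf.keras.Input(shape=(None,), dtype="int32")
dec_emb = layers.Embedding(tgt_vocab, latent_dim)(dec_inputs)
dec_out, _, _ = layers.LSTM(latent_dim, return_sequences=True,
                            return_state=True)(
    dec_emb, initial_state=[state_h, state_c])
outputs = layers.Dense(tgt_vocab, activation="softmax")(dec_out)

model = tf.keras.Model([enc_inputs, dec_inputs], outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.summary()
```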

Attention

  • Source paper: Bahdanau, Dzmitry, Kyunghyun Cho, and Yoshua Bengio. “Neural machine translation by jointly learning to align and translate.” arXiv preprint arXiv:1409.0473 (2014).
  • Paper walkthrough: An introduction to the attention mechanism
  • Code implementation: Attention code (see the sketch below)
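
Attention lets the decoder look at all encoder states instead of a single fixed vector: a score between the current decoder state and every encoder state is softmax-normalized into weights that form a context vector. A NumPy sketch of additive (Bahdanau-style) scoring, with arbitrary dimensions:

```python
# Additive attention: score_i = v^T tanh(W_q s + W_k h_i).
import numpy as np

def softmax(a):
    e = np.exp(a - a.max())
    return e / e.sum()

def additive_attention(decoder_state, encoder_states, W_q, W_k, v):
    scores = np.array([v @ np.tanh(W_q @ decoder_state + W_k @ h)
                       for h in encoder_states])
    weights = softmax(scores)                   # attention distribution
    context = weights @ encoder_states          # weighted sum of states
    return context, weights

rng = np.random.default_rng(0)
d_enc, d_dec, d_att, T = 16, 16, 32, 6
encoder_states = rng.normal(size=(T, d_enc))
decoder_state = rng.normal(size=d_dec)
W_q = rng.normal(scale=0.1, size=(d_att, d_dec))
W_k = rng.normal(scale=0.1, size=(d_att, d_enc))
v = rng.normal(scale=0.1, size=d_att)
context, weights = additive_attention(decoder_state, encoder_states, W_q, W_k, v)
print(context.shape, weights.sum())             # (16,) 1.0
```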

Transformer

  • Source paper: Vaswani, Ashish, et al. “Attention is all you need.” Advances in neural information processing systems 30 (2017).
  • Paper walkthrough: Transformer explained
  • Code implementation: Transformer code (see the sketch below)
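
The Transformer's core operation is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ / √d_k) V, applied over multiple heads with no recurrence at all. A single-head self-attention sketch in NumPy (illustrative only):

```python
# Scaled dot-product self-attention for one head.
import numpy as np

def softmax(a, axis=-1):
    e = np.exp(a - a.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)      # (T_q, T_k) similarity matrix
    weights = softmax(scores, axis=-1)   # each query attends over all keys
    return weights @ V                   # (T_q, d_v)

rng = np.random.default_rng(0)
T, d_model = 10, 64
X = rng.normal(size=(T, d_model))
# Self-attention: queries, keys, and values are projections of the same input.
W_q, W_k, W_v = (rng.normal(scale=0.1, size=(d_model, d_model)) for _ in range(3))
out = scaled_dot_product_attention(X @ W_q, X @ W_k, X @ W_v)
print(out.shape)  # (10, 64)
```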
