Repost:
https://blog.csdn.net/oBrightLamp/article/details/85067981#RNN_138
Table of Contents
Preface
About This Book
Recommended Resources
Open-Source LICENSE
Software Versions
Loss Functions
MSELoss
cross-entropy
softmax
softmax + cross-entropy
Optimization Algorithms
Regularization / Parameter Norm Penalties
SGD, Momentum, RMSprop, Adam
Batch Normalization
Feature Engineering
PCA
Fully Connected Neural Networks
Affine
Convolutional Neural Networks
convolution
Transpose Convolution
MaxPool
ReLU
dropout
Recurrent Neural Networks
RNN
LSTM
GRU
Generative Adversarial Networks
GAN
Hands-On Practice
Kaggle
Appendix
Preface
About This Book
Synopsis
– https://blog.csdn.net/oBrightLamp/article/details/85162129
Recommended Resources
Recommended introductory resources for self-studying deep learning for computer vision
– https://blog.csdn.net/oBrightLamp/article/details/84076410
Open-Source LICENSE
All documents are licensed under the Creative Commons License; all code is licensed under the MIT License.
Software Versions
Python = 3.6
scikit-learn = 0.20.0
TensorFlow = 1.12
PyTorch = 1.0
Loss Functions
MSELoss
MSELoss (mean squared error) explained, with the gradient derivation for backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/85137756
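As a quick companion to the linked post (which compares pure Python and PyTorch), a minimal NumPy sketch of the MSE loss and its gradient; the `mse_loss` helper here is illustrative, not the author's code:

```python
import numpy as np

def mse_loss(y_pred, y_true):
    # L = mean((y_pred - y_true)^2)
    diff = y_pred - y_true
    loss = np.mean(diff ** 2)
    # dL/dy_pred = 2 * (y_pred - y_true) / N
    grad = 2.0 * diff / diff.size
    return loss, grad
```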
cross-entropy
Introducing the concept of information entropy through function plots
– https://blog.csdn.net/oBrightLamp/article/details/85269091
The cross-entropy loss function explained through worked examples, with backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/83962147
Comparing Python and PyTorch implementations of cross-entropy loss and its backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/84029058
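A minimal NumPy sketch of cross-entropy for a one-hot target, with the gradient with respect to the probabilities; an illustrative example, not the linked posts' code:

```python
import numpy as np

def cross_entropy(p, t):
    # p: predicted probabilities, t: one-hot target
    # L = -sum(t * log(p)); dL/dp = -t / p
    loss = -np.sum(t * np.log(p))
    grad = -t / p
    return loss, grad
```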
softmax
The softmax function explained, with the gradient derivation for backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/83959185
Comparing pure Python and PyTorch implementations of softmax and its backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/84034658
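A NumPy sketch of softmax and its backward pass (a product with the Jacobian diag(y) - y yᵀ); illustrative only, not the linked posts' implementation:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))  # shift by max for numerical stability
    return e / e.sum()

def softmax_backward(y, grad_out):
    # vector product with the Jacobian diag(y) - y y^T
    return y * (grad_out - np.dot(grad_out, y))
```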
softmax + cross-entropy
Multi-label softmax + cross-entropy loss explained, with the gradient derivation for backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/84069835
Comparing Python and PyTorch implementations of multi-label softmax + cross-entropy loss and its backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/84073485
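A NumPy sketch of the fused softmax + cross-entropy pass, whose combined gradient simplifies to p - t; illustrative, not the linked posts' code:

```python
import numpy as np

def softmax_cross_entropy(x, t):
    # fused forward pass over logits x with one-hot target t;
    # the combined gradient w.r.t. x simplifies to p - t
    e = np.exp(x - np.max(x))
    p = e / e.sum()
    loss = -np.sum(t * np.log(p))
    return loss, p - t
```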
Optimization Algorithms
Regularization / Parameter Norm Penalties
L2 regularization explained, with the gradient derivation for backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/85290929
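A NumPy sketch of the L2 penalty term and its gradient contribution; the (lam/2) scaling is one common convention and this helper is illustrative:

```python
import numpy as np

def l2_penalty(w, lam):
    # penalty (lam/2) * ||w||^2 added to the loss; its gradient is lam * w
    return 0.5 * lam * np.sum(w ** 2), lam * w
```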
SGD, Momentum, RMSprop, Adam
The common gradient descent algorithms SGD, Momentum, RMSprop, and Adam explained
– https://blog.csdn.net/oBrightLamp/article/details/85218783
Comparing pure Python and PyTorch implementations of the SGD, Momentum, RMSprop, and Adam gradient descent algorithms
– https://blog.csdn.net/oBrightLamp/article/details/85218799
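Single-step NumPy sketches of the four update rules, written in one common formulation (conventions vary between libraries); these helpers are illustrative, not the linked posts' code:

```python
import numpy as np

def sgd_step(w, g, lr):
    return w - lr * g

def momentum_step(w, g, v, lr, beta=0.9):
    v = beta * v - lr * g          # velocity accumulates past gradients
    return w + v, v

def rmsprop_step(w, g, s, lr, rho=0.9, eps=1e-8):
    s = rho * s + (1 - rho) * g ** 2   # running average of squared gradients
    return w - lr * g / (np.sqrt(s) + eps), s

def adam_step(w, g, m, v, t, lr, b1=0.9, b2=0.999, eps=1e-8):
    m = b1 * m + (1 - b1) * g
    v = b2 * v + (1 - b2) * g ** 2
    m_hat = m / (1 - b1 ** t)      # bias correction, t starts at 1
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v
```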
Batch Normalization
Batch Normalization explained, with the gradient derivation for backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/84332455
Comparing Python and PyTorch implementations of Batch Normalization and its backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/84557854
Batch Normalization at test/inference time, and how its running statistics are updated
– https://blog.csdn.net/oBrightLamp/article/details/85391056
Comparing Python and PyTorch implementations of Batch Normalization at test/inference time
– https://blog.csdn.net/oBrightLamp/article/details/85391167
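A NumPy sketch of the train-time and inference-time forward passes, including the running-statistics update discussed in the linked posts; the momentum convention and helper names are assumptions:

```python
import numpy as np

def batchnorm_train(x, gamma, beta, run_mean, run_var, momentum=0.9, eps=1e-5):
    # normalize with batch statistics; update running stats for inference
    mu, var = x.mean(axis=0), x.var(axis=0)
    x_hat = (x - mu) / np.sqrt(var + eps)
    run_mean = momentum * run_mean + (1 - momentum) * mu
    run_var = momentum * run_var + (1 - momentum) * var
    return gamma * x_hat + beta, run_mean, run_var

def batchnorm_infer(x, gamma, beta, run_mean, run_var, eps=1e-5):
    # at test time, use the stored running statistics, not batch statistics
    return gamma * (x - run_mean) / np.sqrt(run_var + eps) + beta
```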
Feature Engineering
PCA
The maximum-variance view of PCA dimensionality reduction explained
– https://blog.csdn.net/oBrightLamp/article/details/85255895
Comparing pure Python and scikit-learn implementations of PCA dimensionality reduction
– https://blog.csdn.net/oBrightLamp/article/details/85255898
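A NumPy sketch of maximum-variance PCA via an eigendecomposition of the covariance matrix; illustrative, not the linked posts' code:

```python
import numpy as np

def pca(X, k):
    # project centered data onto the top-k eigenvectors
    # of its covariance matrix (the directions of maximum variance)
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    top = np.argsort(vals)[::-1][:k]
    return Xc @ vecs[:, top]
```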
Fully Connected Neural Networks
Affine
The affine/linear transformation explained, with the gradient derivation for fully connected layers
– https://blog.csdn.net/oBrightLamp/article/details/84333111
Comparing Python and PyTorch implementations of the affine/linear transformation and fully-connected-layer backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/84453996
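A NumPy sketch of the affine layer y = xW + b and its three gradients; illustrative, not the linked posts' code:

```python
import numpy as np

def affine_forward(x, W, b):
    # y = x W + b, for a batch of row vectors x
    return x @ W + b

def affine_backward(x, W, grad_out):
    # gradients of y = x W + b: w.r.t. x, W, and b respectively
    return grad_out @ W.T, x.T @ grad_out, grad_out.sum(axis=0)
```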
Convolutional Neural Networks
convolution
The convolution function explained, with the gradient derivation for backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/84561088
Comparing Python and PyTorch implementations of convolution and its backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/84589545
The matrix-form computation of convolution and the backpropagation of its gradient
– https://blog.csdn.net/oBrightLamp/article/details/85870773
A Python implementation of matrix-form convolution and its backpropagation, verified against TensorFlow and PyTorch
– https://blog.csdn.net/oBrightLamp/article/details/85870813
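A naive NumPy sketch of the 2-D "valid" cross-correlation used in CNN convolution layers (the linked posts cover the efficient matrix form); illustrative only:

```python
import numpy as np

def conv2d(x, k):
    # slide kernel k over x, no padding, stride 1
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out
```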
Transpose Convolution
Understanding convolution and deconvolution / transpose convolution by comparing TensorFlow and PyTorch
– https://blog.csdn.net/oBrightLamp/article/details/85708124
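A NumPy sketch of transpose convolution as a scatter operation (each input element adds a kernel-shaped patch to the output); an assumption-laden sketch, not the linked post's code:

```python
import numpy as np

def conv2d_transpose(x, k, stride=1):
    # scatter each input element, scaled by the kernel, into the output
    H, W = x.shape
    kh, kw = k.shape
    out = np.zeros((stride * (H - 1) + kh, stride * (W - 1) + kw))
    for i in range(H):
        for j in range(W):
            out[i * stride:i * stride + kh, j * stride:j * stride + kw] += x[i, j] * k
    return out
```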
MaxPool
The MaxPool pooling layer explained, with the backpropagation formula derived
– https://blog.csdn.net/oBrightLamp/article/details/84635346
Comparing Python and PyTorch implementations of MaxPool and its backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/84635308
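A NumPy sketch of non-overlapping max pooling (stride equal to window size); illustrative, not the linked posts' code:

```python
import numpy as np

def maxpool2d(x, size=2):
    # non-overlapping max pooling: take the max of each size-by-size block
    H, W = x.shape
    x = x[:H - H % size, :W - W % size]  # drop ragged edges
    return x.reshape(H // size, size, W // size, size).max(axis=(1, 3))
```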
ReLU
The ReLU function explained, with the gradient derivation for backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/84326978
Comparing Python and PyTorch implementations of ReLU and its backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/84326804
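A NumPy sketch of ReLU and its backward pass; illustrative only:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0)

def relu_backward(x, grad_out):
    # the gradient passes through only where the input was positive
    return grad_out * (x > 0)
```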
dropout
The dropout function explained, with the gradient derivation for backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/84105097
Comparing Python and PyTorch implementations of dropout and its backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/84326091
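A NumPy sketch of inverted dropout, which scales the kept units at train time so inference needs no rescaling; illustrative, not the linked posts' code:

```python
import numpy as np

def dropout(x, p, training=True, rng=None):
    # inverted dropout: keep each unit with probability 1 - p and
    # scale by 1/(1-p), so the inference-time forward pass is an identity
    if not training:
        return x, None
    rng = rng or np.random.default_rng(0)
    mask = (rng.random(x.shape) >= p) / (1.0 - p)
    return x * mask, mask
```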
Recurrent Neural Networks
RNN
The RNNCell unit explained, with the gradient derivation for backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/85015325
Comparing pure Python and PyTorch implementations of RNNCell and its backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/85015402
Comparing pure Python and PyTorch implementations of a full RNN and its backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/85015387
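A one-line NumPy sketch of the RNNCell recurrence; illustrative, not the linked posts' code:

```python
import numpy as np

def rnn_cell(x, h, Wx, Wh, b):
    # h_t = tanh(x_t Wx + h_{t-1} Wh + b)
    return np.tanh(x @ Wx + h @ Wh + b)
```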
LSTM
The long short-term memory LSTMCell unit explained, with the gradient derivation for backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/85068285
Comparing pure Python and PyTorch implementations of an LSTM and its backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/85069255
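A NumPy sketch of one LSTMCell step; the single stacked weight matrix and gate ordering (input, forget, cell, output) are one common convention, not necessarily the linked posts':

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_cell(x, h, c, W, b):
    # W maps [x, h] to four stacked gates: input, forget, cell, output
    H = h.shape[0]
    z = np.concatenate([x, h]) @ W + b
    i, f = sigmoid(z[:H]), sigmoid(z[H:2 * H])
    g, o = np.tanh(z[2 * H:3 * H]), sigmoid(z[3 * H:])
    c_new = f * c + i * g          # update the cell state
    h_new = o * np.tanh(c_new)     # emit the hidden state
    return h_new, c_new
```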
GRU
The GRUCell gated recurrent unit explained, with the gradient derivation for backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/85109589
Comparing pure Python and PyTorch implementations of a GRU and its backpropagation
– https://blog.csdn.net/oBrightLamp/article/details/85109607
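A NumPy sketch of one GRUCell step, using PyTorch's mixing convention h' = (1-z)·n + z·h (conventions differ between references); illustrative only:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_cell(x, h, Wz, Wr, Wn):
    # z: update gate, r: reset gate, n: candidate hidden state
    xh = np.concatenate([x, h])
    z = sigmoid(xh @ Wz)
    r = sigmoid(xh @ Wr)
    n = np.tanh(np.concatenate([x, r * h]) @ Wn)
    return (1 - z) * n + z * h
```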
Generative Adversarial Networks
GAN
The mathematical principles of Generative Adversarial Networks (GANs)
– https://blog.csdn.net/oBrightLamp/article/details/86553074
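For reference alongside the linked post, the standard GAN minimax objective (from Goodfellow et al.), in which the discriminator D and generator G play a two-player game:

```latex
\min_G \max_D V(D, G) =
  \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}[\log D(x)]
  + \mathbb{E}_{z \sim p_z(z)}[\log(1 - D(G(z)))]
```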
Hands-On Practice
Kaggle
Getting started quickly with PyTorch on Kaggle (weed seedling image classification)
– https://blog.csdn.net/oBrightLamp/article/details/84947499
Appendix
Commonly used LaTeX formulas
– https://blog.csdn.net/oBrightLamp/article/details/83964331
---------------------
Author: BrightLampCsdn
Source: CSDN
Original: https://blog.csdn.net/oBrightLamp/article/details/85067981
Copyright notice: This is the blogger's original article; please include a link to the original post when reposting.