recurrent neural network regularization


This post introduces the regularization technique for LSTMs proposed in Zaremba, Sutskever & Vinyals, "Recurrent Neural Network Regularization": dropout.

The LSTM architecture:


[Figures 1–2: the LSTM cell and its equations]

Meaning of the symbols:




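For reference, this is the LSTM formulation used in the paper: $h_t^l$ is the hidden state of layer $l$ at time step $t$, $c_t^l$ is the memory cell, $T_{2n,4n}$ is an affine transform ($Wx + b$), $\odot$ is element-wise multiplication, and $i, f, o, g$ are the input gate, forget gate, output gate, and candidate cell update:

$$
\begin{pmatrix} i \\ f \\ o \\ g \end{pmatrix}
=
\begin{pmatrix} \mathrm{sigm} \\ \mathrm{sigm} \\ \mathrm{sigm} \\ \tanh \end{pmatrix}
T_{2n,4n}
\begin{pmatrix} h_t^{l-1} \\ h_{t-1}^{l} \end{pmatrix}
$$

$$
c_t^l = f \odot c_{t-1}^l + i \odot g, \qquad
h_t^l = o \odot \tanh\!\left(c_t^l\right)
$$
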
Regularizing an LSTM has two requirements: first, the dropout must actually have a regularizing effect; second, it must not destroy the LSTM's ability to remember over long time spans. The paper therefore applies dropout only between layers at the same time step (the non-recurrent, layer-to-layer connections); there is no dropout on the recurrent connections from t-1 to t.
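
In the paper's notation, this means the dropout operator $\mathbf{D}$ (which sets a random subset of its argument to zero) is applied only to the input arriving from the layer below, never to the recurrent input:

$$
\begin{pmatrix} i \\ f \\ o \\ g \end{pmatrix}
=
\begin{pmatrix} \mathrm{sigm} \\ \mathrm{sigm} \\ \mathrm{sigm} \\ \tanh \end{pmatrix}
T_{2n,4n}
\begin{pmatrix} \mathbf{D}\!\left(h_t^{l-1}\right) \\ h_{t-1}^{l} \end{pmatrix}
$$

Because $h_{t-1}^l$ passes through untouched, information stored in the cell state is perturbed by dropout only $L + 1$ times in an $L$-layer network, no matter how many time steps it persists.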


[Figure 3: dropout applied only to the non-recurrent, layer-to-layer connections]

The corresponding TensorFlow API is the cell wrapper tf.nn.rnn_cell.DropoutWrapper:

input_keep_prob: dropout applied to the input before it is passed into the current cell;

output_keep_prob: dropout applied to the output of the current cell.

As you can see, a state_keep_prob is also provided; the paper's scheme corresponds to leaving it at its default of 1.0, i.e. no dropout on the recurrent state.

[Figure 4: DropoutWrapper API documentation]
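
A minimal sketch of how these three arguments fit together, assuming the TF 1.x-style RNN API (available in TF 2 under tf.compat.v1); the layer size, keep probabilities, and input shape here are illustrative only:

```python
import tensorflow.compat.v1 as tf  # TF 1.x-style API
tf.disable_v2_behavior()

# One LSTM layer wrapped with dropout, following the paper's scheme:
# dropout on the layer-to-layer (non-recurrent) connections only.
cell = tf.nn.rnn_cell.LSTMCell(num_units=128)
cell = tf.nn.rnn_cell.DropoutWrapper(
    cell,
    input_keep_prob=0.8,   # dropout before the input enters the cell
    output_keep_prob=0.8,  # dropout on the cell's output
    state_keep_prob=1.0,   # 1.0 = no dropout on the t-1 -> t recurrent state
)

# Unroll over a [batch, time, features] input.
inputs = tf.placeholder(tf.float32, [None, 20, 50])
outputs, final_state = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
```

Note that in a multi-layer stack built with MultiRNNCell, setting both input_keep_prob and output_keep_prob on every layer would drop the same activations twice between adjacent layers; a common choice is to use output_keep_prob on each layer and input_keep_prob only on the first.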

Reference: http://colah.github.io/posts/2015-08-Understanding-LSTMs/
