TensorFlow: tf.nn.dynamic_rnn Explained

outputs, states = tf.nn.dynamic_rnn()
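For context, here is a minimal sketch of how such a call is typically set up with the TF 1.x API; the shape values and variable names below are illustrative, not from the original post:

```python
import tensorflow as tf  # TF 1.x style API

batch_size, max_time, input_dim, num_units = 32, 10, 8, 64  # illustrative sizes

inputs = tf.placeholder(tf.float32, [batch_size, max_time, input_dim])
cell = tf.nn.rnn_cell.BasicLSTMCell(num_units)

# outputs: the hidden state at every time step, shape [batch_size, max_time, num_units]
# states:  the final LSTMStateTuple(c, h) after the last time step
outputs, states = tf.nn.dynamic_rnn(cell, inputs, dtype=tf.float32)
```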

Let's take a look at the definition of BasicLSTMCell's call function (only the final lines are shown):
```python
# i, j, f, o are the input gate, candidate input, forget gate, and output gate
new_c = c * sigmoid(f + self._forget_bias) + sigmoid(i) * self._activation(j)  # update the cell state
new_h = self._activation(new_c) * sigmoid(o)                                   # compute the new hidden state
new_state = LSTMStateTuple(new_c, new_h)
return new_h, new_state
```
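In BasicLSTMCell, the gates i, j, f, o come from a single linear layer over [inputs, h] that is split into four parts. The following NumPy sketch of one LSTM step mirrors the four lines above; it is a paraphrase of the math, not the TensorFlow source, and assumes the default tanh activation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, c, h, W, b, forget_bias=1.0):
    # x: [batch, input_dim]; c, h: [batch, num_units]
    # W: [input_dim + num_units, 4 * num_units]; b: [4 * num_units]
    concat = np.concatenate([x, h], axis=1) @ W + b
    i, j, f, o = np.split(concat, 4, axis=1)  # input gate, candidate, forget gate, output gate
    new_c = c * sigmoid(f + forget_bias) + sigmoid(i) * np.tanh(j)
    new_h = np.tanh(new_c) * sigmoid(o)
    return new_h, (new_c, new_h)
```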

The returned state is the combination of new_c and new_h, while the output is new_h alone. If we are working on a classification problem, we still need to add a separate Softmax layer on top of new_h to obtain the final class-probability output.
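A sketch of such a classification head, continuing the dynamic_rnn snippet above (num_classes is an assumed, illustrative parameter):

```python
num_classes = 5  # assumed for illustration

# states.h is the final new_h; project it to class logits, then normalize with softmax
logits = tf.layers.dense(states.h, num_classes)  # [batch_size, num_classes]
probs = tf.nn.softmax(logits)
```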

new_c: the cell state, corresponding to C in Figure 1 and s in Figure 2
new_h: the hidden state, corresponding to h in Figure 1 and v in Figure 2

[Figure 1: LSTM cell diagram (source: http://colah.github.io/posts/2015-08-Understanding-LSTMs/)]
[Figure 2: LSTM diagram (source: "A Critical Review of Recurrent Neural Networks for Sequence Learning")]
