Some TensorFlow implementations of attention, and how to use them

https://github.com/allenai/document-qa/blob/master/docqa/nn/attention.py
Usage:

# Assumes the docqa layers from the repo above are importable, e.g.:
# from docqa.nn.attention import StaticAttentionSelf
# from docqa.nn.similarity_layers import DotProduct
# Here rnn refers to tf.nn (TF 1.x).

self.outputs, _ = rnn.bidirectional_dynamic_rnn(
    lstm_cell_fw,
    lstm_cell_bw,
    self.input_emb,
    dtype=tf.float32,
    sequence_length=self.sequence_len,
)

# outputs is a (forward, backward) tuple, so outputs[0] keeps only the
# forward states; use tf.concat(self.outputs, axis=2) to keep both.
self_attention = StaticAttentionSelf(attention=DotProduct())
attn_output = self_attention.apply(None, self.outputs[0], x_mask=self.sequence_len)
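To make the computation behind the layer concrete, here is a minimal NumPy sketch of dot-product self-attention over padded sequences. It is an illustration of the general technique, not the docqa implementation: it assumes the scores are softmax(x xᵀ) with padded key positions masked out, and the function name and signature are hypothetical.

```python
import numpy as np

def dot_product_self_attention(x, lengths):
    """Dot-product self-attention over a batch of padded sequences.

    x: [batch, time, dim] float array
    lengths: [batch] valid sequence lengths (plays the role of x_mask above)
    Returns the attended output, shape [batch, time, dim].
    """
    batch, time, dim = x.shape
    # Raw similarity scores: every position attends to every position.
    scores = np.matmul(x, np.transpose(x, (0, 2, 1)))  # [batch, time, time]
    # Mask padded key positions with a large negative value before softmax.
    valid = np.arange(time)[None, :] < np.asarray(lengths)[:, None]  # [batch, time]
    scores = np.where(valid[:, None, :], scores, -1e30)
    # Numerically stable softmax over the key axis.
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    # Weighted sum of the values.
    return np.matmul(weights, x)
```

With lengths applied, padded positions get zero attention weight, so the output at every query position is a convex combination of only the valid time steps.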
