How to Add an Attention Mechanism to an LSTM/RNN

PyTorch LSTM + Attention for Text Classification
https://blog.csdn.net/qq_34838643/article/details/110200332

What is attention mechanism?
https://towardsdatascience.com/what-is-attention-mechanism-can-i-have-your-attention-please-3333637f2eac

Adding A Custom Attention Layer To Recurrent Neural Network In Keras
https://machinelearningmastery.com/adding-a-custom-attention-layer-to-recurrent-neural-network-in-keras/

Related question
Why the performance of LSTM decreases after the addition of attention using pytorch?
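The approach the tutorials above describe, scoring each LSTM hidden state, softmaxing the scores over time steps, and pooling a weighted context vector for classification, can be sketched in PyTorch as follows. All layer sizes and the simple one-layer scoring function are illustrative choices, not taken from any of the linked posts:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LSTMWithAttention(nn.Module):
    """LSTM encoder followed by simple attention pooling for classification.

    Hyperparameters (vocab_size, embed_dim, hidden_dim) are placeholder values.
    """

    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        # One scalar attention score per time step
        self.attn = nn.Linear(hidden_dim, 1)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, x):
        # x: (batch, seq_len) integer token ids
        outputs, _ = self.lstm(self.embedding(x))   # (batch, seq_len, hidden)
        scores = self.attn(torch.tanh(outputs))     # (batch, seq_len, 1)
        weights = F.softmax(scores, dim=1)          # normalize over time steps
        context = (weights * outputs).sum(dim=1)    # weighted sum: (batch, hidden)
        return self.fc(context), weights.squeeze(-1)

model = LSTMWithAttention()
tokens = torch.randint(0, 1000, (4, 12))  # batch of 4 sequences of length 12
logits, attn = model(tokens)
print(logits.shape, attn.shape)  # torch.Size([4, 2]) torch.Size([4, 12])
```

Instead of using only the last hidden state, the classifier sees a context vector in which informative time steps get larger weights; the returned `attn` tensor can also be inspected to see which tokens the model attended to.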
