Learning notes on attention, LSTM, Transformer, and BERT in NLP

1. The attention mechanism

References

A fully illustrated guide to RNNs, RNN variants, Seq2Seq, and the attention mechanism: https://www.leiphone.com/news/201709/8tDpwklrKubaecTa.html

An introduction to the attention mechanism: https://www.cnblogs.com/hiyoung/p/9860561.html

Read the two articles together; the ideas click much faster that way. A minimal code sketch follows below.
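The core of the mechanism both articles build up to is a weighted sum over all positions, with weights computed from query-key similarity. Here is a minimal sketch of scaled dot-product attention in PyTorch; the tensor shapes and the function name are illustrative assumptions, not code from either post.

```python
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(query, key, value):
    # query/key/value: (batch, seq_len, d_model)
    d_k = query.size(-1)
    # Similarity score between every query position and every key position.
    scores = torch.matmul(query, key.transpose(-2, -1)) / math.sqrt(d_k)
    # Normalize scores into attention weights that sum to 1 per query.
    weights = F.softmax(scores, dim=-1)
    # Each output is a weighted sum of the values over all positions.
    return torch.matmul(weights, value), weights

q = k = v = torch.randn(2, 5, 64)      # toy batch: 2 sequences of 5 steps
out, attn = scaled_dot_product_attention(q, k, v)
print(out.shape, attn.shape)           # (2, 5, 64) and (2, 5, 5)
```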

2. LSTM

https://www.cnblogs.com/zuotongbin/p/10698843.html
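As a quick companion to the article, this is a minimal sketch of running a toy sequence through PyTorch's built-in nn.LSTM. All layer sizes are arbitrary values chosen for illustration, not taken from the post.

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=10, hidden_size=20, num_layers=1, batch_first=True)

x = torch.randn(4, 7, 10)   # (batch, seq_len, input_size)
output, (h_n, c_n) = lstm(x)

print(output.shape)  # (4, 7, 20): hidden state at every time step
print(h_n.shape)     # (1, 4, 20): final hidden state per layer
print(c_n.shape)     # (1, 4, 20): final cell state per layer
```

The separate hidden state h_n and cell state c_n are exactly the two memory paths the article's gate diagrams describe.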

3. Transformer

https://www.jianshu.com/p/e7d8caa13b21
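The article walks through the encoder-decoder architecture; the sketch below stacks two encoder layers with PyTorch's built-in Transformer modules just to show the shapes involved. The layer sizes and head count are illustrative assumptions.

```python
import torch
import torch.nn as nn

# One encoder layer = multi-head self-attention + feed-forward sublayers.
layer = nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True)
encoder = nn.TransformerEncoder(layer, num_layers=2)

x = torch.randn(2, 5, 64)   # (batch, seq_len, d_model)
out = encoder(x)
print(out.shape)            # (2, 5, 64): same shape, contextualized
```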

4. BERT

https://github.com/google-research/bert

https://www.cnblogs.com/rucwxb/p/10277217.html

https://zhuanlan.zhihu.com/p/46652512

https://www.jianshu.com/p/60fc9253a0bf
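The google-research/bert repo above is the original TensorFlow implementation. As an alternative sketch, the snippet below extracts pretrained BERT features through the Hugging Face transformers library instead; the bert-base-chinese checkpoint name is my assumption, not something the linked posts use.

```python
import torch
from transformers import BertTokenizer, BertModel

# Assumed checkpoint; any pretrained BERT model name would work the same way.
tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
model = BertModel.from_pretrained("bert-base-chinese")

inputs = tokenizer("自然语言处理", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per token, including [CLS] and [SEP].
print(outputs.last_hidden_state.shape)  # (1, seq_len, 768)
```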

5. To read when time allows

https://zhuanlan.zhihu.com/p/50443871
