BERT resource collection

1. Seq2seq resources

  • Teacher forcing strategy

https://github.com/tensorflow/nmt

https://zhuanlan.zhihu.com/p/57155059
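The links above cover teacher forcing in seq2seq training. As a rough illustration (a toy sketch not taken from either post; the lookup-table "decoder" and token names are invented for the example), the difference between teacher forcing and free running is which token gets fed back into the decoder at each step:

```python
# Toy "decoder": predicts the next token from the previous one via a lookup
# table; unseen inputs yield "<unk>", mimicking an imperfect model.
def toy_decoder_step(prev_token, table):
    return table.get(prev_token, "<unk>")

def decode(target, table, teacher_forcing=True):
    outputs = []
    prev = "<sos>"
    for gold in target:
        pred = toy_decoder_step(prev, table)
        outputs.append(pred)
        # Teacher forcing: feed the ground-truth token as the next input;
        # free running: feed the model's own (possibly wrong) prediction.
        prev = gold if teacher_forcing else pred
    return outputs

# Deliberately broken mapping at "e" so one error occurs mid-sequence.
table = {"<sos>": "h", "h": "e", "e": "x", "l": "l", "o": "!"}
target = list("hello")
print(decode(target, table, teacher_forcing=True))   # recovers after the error
print(decode(target, table, teacher_forcing=False))  # the error compounds
```

With teacher forcing the single wrong prediction does not pollute later steps, because the next input is always the gold token; in free-running mode the wrong token is fed back and every subsequent step degrades, which is exactly the exposure-bias problem these posts discuss.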

  • Seq2seq applications

LaTeX recognition: https://guillaumegenthial.github.io/sequence-to-sequence.html

Alphabetical sorting: https://zhuanlan.zhihu.com/p/27608348

  • API usage

https://zhuanlan.zhihu.com/p/47929039

2. Understanding the BERT model

  • Understanding the FFN (position-wise feed-forward network)

https://blog.csdn.net/u013166817/article/details/85837124
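For reference alongside the post above, the position-wise FFN in each Transformer/BERT layer is just two linear maps with a nonlinearity in between, applied independently at every sequence position. A minimal NumPy sketch (the dimensions here are toy values; BERT-base uses d_model = 768 and d_ff = 3072, and BERT uses GELU rather than the ReLU shown):

```python
import numpy as np

def position_wise_ffn(x, w1, b1, w2, b2):
    # Expand to d_ff, apply the nonlinearity, project back to d_model.
    # The same weights are applied at every position independently.
    return np.maximum(0, x @ w1 + b1) @ w2 + b2

rng = np.random.default_rng(0)
seq_len, d_model, d_ff = 5, 8, 32   # toy sizes, not BERT's real ones
x = rng.standard_normal((seq_len, d_model))
w1 = rng.standard_normal((d_model, d_ff)); b1 = np.zeros(d_ff)
w2 = rng.standard_normal((d_ff, d_model)); b2 = np.zeros(d_model)

out = position_wise_ffn(x, w1, b1, w2, b2)
print(out.shape)  # same shape as the input: (5, 8)
```

The key point the post makes is the "position-wise" part: unlike attention, the FFN mixes no information across positions, only across feature dimensions.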

  • The Illustrated Transformer (Chinese translation)

https://blog.csdn.net/yujianmin1990/article/details/85221271

  • Google BERT explained

https://zhuanlan.zhihu.com/p/46652512

  • Text summarization

text-summarization-tensorflow

https://blog.csdn.net/rockingdingo/article/details/55224282

3. NLP visualization

  • Visualizing A Neural Machine Translation Model (Mechanics of Seq2seq Models With Attention)

https://jalammar.github.io/visualizing-neural-machine-translation-mechanics-of-seq2seq-models-with-attention/
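Jalammar's post above animates how a seq2seq decoder attends over encoder states. The core computation is small enough to sketch directly (a toy NumPy example, not code from the post; the one-hot keys are contrived so the alignment is obvious):

```python
import numpy as np

def attention(query, keys, values):
    # Score each encoder state by its dot product with the decoder query,
    # softmax the scores into weights, then take a weighted sum of values.
    scores = keys @ query
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ values, weights

# Toy encoder: 3 positions with orthogonal one-hot keys.
keys = np.eye(3, 4)
values = np.arange(12, dtype=float).reshape(3, 4)
query = 2.0 * keys[1]          # a query aligned with position 1

context, weights = attention(query, keys, values)
print(weights.argmax())        # position 1 gets the highest weight
print(context.round(2))        # context vector is pulled toward values[1]
```

These per-step weight vectors are exactly what the post visualizes as the attention heatmap: one softmax distribution over source positions for each decoded token.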

  • The Illustrated Transformer

https://jalammar.github.io/illustrated-transformer/

  • The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning)

https://jalammar.github.io/illustrated-bert/
