https://www.imooc.com/article/23821
https://www.cnblogs.com/pinard/p/6509630.html
An intuitive explanation of the attention mechanism: http://www.360doc.com/content/18/0506/06/36490684_751494854.shtml
Seq2Seq:http://blog.itpub.net/31545819/viewspace-2564383/ https://zhuanlan.zhihu.com/p/40920384
http://bitjoy.net/2019/08/02/cs224n%EF%BC%881-31%EF%BC%89translation-seq2seq-attention/ (includes machine translation evaluation)
NLP fundamentals: tokenization, vectorization, and POS tagging: https://www.cnblogs.com/qcloud1001/p/7903008.html
Code walkthrough of an LSTM: http://wemedia.ifeng.com/28405503/wemedia.shtml
End-to-end NLP pipeline: https://www.cnblogs.com/zongfa/p/9563510.html https://blog.51cto.com/dengshuangfu/2354806
Building a multi-layer bidirectional LSTM network: https://www.e-learn.cn/content/qita/1940122
Building a chatbot with a seq2seq model: http://www.easyapple.net/?p=1384 (good for showing off)
English–French machine translation with Seq2Seq in TensorFlow (seq2seq training details):
https://baijiahao.baidu.com/s?id=1601215537638531271&wfr=spider&for=pc
The Illustrated Word2vec: https://cloud.tencent.com/developer/article/1418647
Building a seq2seq-with-attention chatbot in TensorFlow: https://www.cnblogs.com/wf-ml/p/10967042.html
Introduction to tf.clip_by_global_norm: https://www.cnblogs.com/baochen/p/8992841.html
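A minimal NumPy sketch of what tf.clip_by_global_norm computes (not TensorFlow's actual implementation): the global norm is the L2 norm over all gradient tensors taken together, and if it exceeds `clip_norm`, every tensor is scaled by the same factor `clip_norm / global_norm`, preserving the gradient direction.

```python
import numpy as np

def clip_by_global_norm(grads, clip_norm):
    # global_norm = sqrt(sum of squared L2 norms of all tensors)
    global_norm = np.sqrt(sum(np.sum(g ** 2) for g in grads))
    # scale is 1.0 when global_norm <= clip_norm, else clip_norm / global_norm
    scale = clip_norm / max(global_norm, clip_norm)
    return [g * scale for g in grads], global_norm

grads = [np.array([3.0, 4.0]), np.array([0.0, 12.0])]  # norms 5 and 12
clipped, gnorm = clip_by_global_norm(grads, clip_norm=6.5)
# gnorm = sqrt(9 + 16 + 144) = 13, so every tensor is scaled by 0.5
```

Scaling all tensors jointly (rather than clipping each one independently, as tf.clip_by_norm does per tensor) is why this is the standard choice for stabilizing RNN/seq2seq training.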
When all elements of a Dataset have been read => tf.errors.OutOfRangeError: https://www.cnblogs.com/hellcat/p/8569651.html
Dataset and Iterator in TensorFlow's data pipeline: https://www.cnblogs.com/huangyc/p/10339433.html
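In the TF1 pattern these links describe, you repeatedly run the iterator's next-element op and break when the Dataset is exhausted and raises tf.errors.OutOfRangeError. A pure-Python sketch of the same drain-until-exhausted loop (plain iterators signal exhaustion with StopIteration, playing the role of OutOfRangeError here):

```python
def consume_all(dataset):
    """Drain an iterable to a list, mirroring the TF1 loop that
    catches tf.errors.OutOfRangeError when a Dataset runs out."""
    it = iter(dataset)                 # like dataset.make_one_shot_iterator()
    results = []
    while True:
        try:
            results.append(next(it))   # like sess.run(next_element)
        except StopIteration:          # TF raises tf.errors.OutOfRangeError
            break
    return results

consume_all(range(3))  # → [0, 1, 2]
```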
Logistic regression: http://www.360doc.com/content/18/0813/09/11935121_777859274.shtml
Transformer-XL: http://m.sohu.com/a/297688215_99965580