RAG: Retrieval-Augmented Generation for Knowledge-Intensive NLP Tasks (Paper Reading Notes)
NeurIPS 2020
Paper: https://arxiv.org/abs/2005.11401
Code: GitHub - huggingface/transformers: Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.

Contents
0. Background
1. Abstract
2. Introduction
3. Conclusion
4. Model
5. Experiments
6. Versus REALM