Question Answering: Danqi Chen (Stanford)

Question Answering

Main contributions:
1. Model: SAR (Stanford Attentive Reader), covered in Chapter 2 of the thesis (linked below)
2. System: facebookresearch/DrQA
https://github.com/facebookresearch/DrQA

=================================================

1. SQuAD: 100,000+ Questions for Machine Comprehension of Text, 2016
   https://arxiv.org/pdf/1606.05250.pdf

2. Bidirectional Attention Flow for Machine Comprehension, 2016
   https://arxiv.org/pdf/1611.01603.pdf
   [ZH] https://zhuanlan.zhihu.com/p/53626872
   [ZH] https://blog.csdn.net/mottled233/article/details/104409697

3. Reading Wikipedia to Answer Open-Domain Questions, 2017
   https://arxiv.org/pdf/1704.00051.pdf
   [ZH] https://blog.csdn.net/qq_28385535/article/details/105761817
   [ZH] https://blog.csdn.net/shark803/article/details/96582622
   https://jozeelin.github.io/2019/08/13/drqa/
   https://zhuanlan.zhihu.com/p/93078867

4. Latent Retrieval for Weakly Supervised Open Domain Question Answering, ACL 2019, Google
   https://arxiv.org/pdf/1906.00300.pdf
   [ZH] https://zhuanlan.zhihu.com/p/93580777

5. Dense Passage Retrieval for Open-Domain Question Answering, 2020
   https://arxiv.org/pdf/2004.04906.pdf
   [ZH] https://blog.csdn.net/c9Yv2cf9I06K2A9E/article/details/106435207
Before retrieval, a dense encoder first encodes every passage in the document collection.
At query time, a second dense encoder encodes the question; the similarity of the two
representations is computed as a dot product, sim(q, p) = E_Q(q)^T E_P(p), and the top-k
passages are returned as the result. The encoder is bert-base-uncased, with the [CLS]
vector taken as the representation. Because the collection can be very large, the authors
use Facebook's FAISS to index the encoded vectors.
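
A minimal sketch of this two-encoder pipeline, assuming bert-base-uncased is reused for both sides (the paper trains separate question and passage encoders) and using FAISS's exact inner-product index on a toy passage list:

```python
# Toy DPR-style retrieval sketch (illustrative, not the paper's code).
# Assumption: one bert-base-uncased model serves as both encoders here;
# DPR trains a separate question encoder and passage encoder.
import faiss
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

@torch.no_grad()
def encode(texts):
    # Representation = the [CLS] vector (position 0 of the last hidden layer).
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    return encoder(**batch).last_hidden_state[:, 0].contiguous().numpy()

# Offline: encode every passage and put the vectors in a FAISS index.
passages = [
    "Paris is the capital and most populous city of France.",
    "FAISS is a library for efficient similarity search of dense vectors.",
]
index = faiss.IndexFlatIP(768)  # inner product == the dot-product similarity above
index.add(encode(passages))

# Online: encode the question and take the top-k passages.
scores, ids = index.search(encode(["What is the capital of France?"]), 2)
for i, s in zip(ids[0], scores[0]):
    print(f"{s:.2f}  {passages[i]}")
```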

5.a Leveraging Passage Retrieval with Generative Models for Open Domain Question Answering, 2020
    https://arxiv.org/abs/2007.01282
5.b How Much Knowledge Can You Pack Into the Parameters of a Language Model?, 2020
    https://arxiv.org/abs/2002.08910
6.a Real-Time Open-Domain Question Answering with Dense-Sparse Phrase Index, 2019
    https://arxiv.org/abs/1906.05807
6.b Learning Dense Representations of Phrases at Scale, 2021
    https://arxiv.org/pdf/2012.12624.pdf
It is possible to encode all the phrases (60 billion phrases in Wikipedia) using dense vectors
and only do nearest neighbor search without a BERT model at inference time!
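
As a rough illustration of that claim, here is a hypothetical sketch in which inference reduces to a single nearest-neighbor query against a precomputed index; random vectors stand in for the real phrase and question embeddings:

```python
# Hypothetical sketch: once phrase vectors are precomputed offline, answering
# a question is just nearest-neighbor search -- no BERT pass over documents.
# Random vectors stand in for the real phrase/question embeddings.
import faiss
import numpy as np

dim, num_phrases = 768, 100_000  # the real index holds tens of billions of phrases
rng = np.random.default_rng(0)
phrase_vecs = rng.standard_normal((num_phrases, dim), dtype=np.float32)  # built offline

index = faiss.IndexFlatIP(dim)
index.add(phrase_vecs)

question_vec = rng.standard_normal((1, dim), dtype=np.float32)  # from the question encoder
scores, phrase_ids = index.search(question_vec, 5)  # top-5 candidate answer phrases
print(phrase_ids[0], scores[0])
```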

=================================================
Neural Reading Comprehension and Beyond (PhD thesis), 2018
https://www.cs.princeton.edu/~danqic/papers/thesis.pdf
https://github.com/danqi/thesis
=================================================
ACL2020 Tutorial: Open-Domain Question Answering
https://github.com/danqi/acl2020-openqa-tutorial
=================================================

Chinese translations:
https://blog.csdn.net/Magical_Bubble/article/details/89488722
https://blog.csdn.net/mottled233/article/details/102995776
https://blog.csdn.net/cindy_1102/article/details/88714390