[Paper Reading - Commonsense Reasoning] BERT + Commonsense Reasoning: Research Progress on Combining Pre-trained Models with Commonsense Reasoning

I. Model Summary

1. CommonsenseQA

(1) KagNet: Knowledge-Aware Graph Networks for Commonsense Reasoning

(2) Graph-Based Reasoning over Heterogeneous External Knowledge for Commonsense Question Answering

(3) Improving Commonsense Question Answering by Graph-based Iterative Retrieval over Multiple Knowledge Sources

(4) Fusing Context Into Knowledge Graph for Commonsense Reasoning

(5) I Know What You Asked: Graph Path Learning using AMR for Commonsense Reasoning

(6) Does BERT Solve Commonsense Task via Commonsense Knowledge?

(7) Scalable Multi-Hop Relational Reasoning for Knowledge-Aware Question Answering

(8) Generative Data Augmentation for Commonsense Reasoning

(9) Pre-training Is (Almost) All You Need: An Application to Commonsense Reasoning

(10) FreeLB: Enhanced Adversarial Training for Natural Language Understanding

(11) Commonsense Knowledge + BERT for Level 2 Reading Comprehension Ability Test

(12) Align, Mask and Select: A Simple Method for Incorporating Commonsense Knowledge into Language Representation Models

(13) Explain Yourself! Leveraging Language Models for Commonsense Reasoning

(14) Connecting the Dots: A Knowledgeable Path Generator for Commonsense Question Answering


2. CosmosQA

(1) REM-Net: Recursive Erasure Memory Network for Commonsense Evidence Refinement

(2) Commonsense Evidence Generation and Injection in Reading Comprehension


3. VCR

(1) KVL-BERT: Knowledge Enhanced Visual-and-Linguistic BERT for Visual Commonsense Reasoning
