[Paper Reading --- BERT + Knowledge] A Summary of Pre-trained Models that Incorporate Knowledge Graphs

I. Model Overview

1. ERNIE: Enhanced Language Representation with Informative Entities

2. K-BERT: Enabling Language Representation with Knowledge Graph

3. KEPLER: A Unified Model for Knowledge Embedding and Pre-trained Language Representation

4. CoLAKE: Contextualized Language and Knowledge Embedding

5. Exploiting Structured Knowledge in Text via Graph-Guided Representation Learning

6. K-ADAPTER: Infusing Knowledge into Pre-Trained Models with Adapters

7. Integrating Graph Contextualized Knowledge into Pre-trained Language Models

8. JAKET: Joint Pre-training of Knowledge Graph and Language Understanding

9. KG-BERT: BERT for Knowledge Graph Completion

10. Knowledge-Enriched Transformer for Emotion Detection in Textual Conversations

11. Barack's Wife Hillary: Using Knowledge Graphs for Fact-Aware Language Modeling

12. Are Pretrained Language Models Symbolic Reasoners Over Knowledge?

13. How Much Knowledge Can You Pack Into the Parameters of a Language Model?
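
Despite their differences, several of the entity-injection models in this list (e.g., ERNIE, CoLAKE) share one core idea: align entity mentions in the text with pre-trained KG entity embeddings (such as TransE vectors) and fuse the two representations inside the Transformer. The snippet below is a minimal PyTorch sketch of such a fusion step, loosely modeled on ERNIE's information-fusion layer; the class name `EntityFusionLayer`, the dimensions, and the alignment scheme are all illustrative assumptions, not any paper's actual code.

```python
import torch
import torch.nn as nn


class EntityFusionLayer(nn.Module):
    """Minimal sketch of token/entity fusion (hypothetical, ERNIE-style).

    Each token aligned to a KG entity has its hidden state fused with
    that entity's pre-trained embedding; unaligned tokens pass through
    with only the token projection applied.
    """

    def __init__(self, hidden_size: int = 768, entity_size: int = 100):
        super().__init__()
        self.token_proj = nn.Linear(hidden_size, hidden_size)
        self.entity_proj = nn.Linear(entity_size, hidden_size)
        self.out_proj = nn.Linear(hidden_size, hidden_size)

    def forward(
        self,
        token_states: torch.Tensor,   # (batch, seq_len, hidden_size), LM hidden states
        entity_embs: torch.Tensor,    # (batch, seq_len, entity_size), aligned KG vectors
        alignment_mask: torch.Tensor, # (batch, seq_len, 1), 1.0 where an entity is aligned
    ) -> torch.Tensor:
        # Add the projected entity embedding only at aligned positions,
        # then apply a nonlinearity and an output projection.
        fused = torch.relu(
            self.token_proj(token_states)
            + alignment_mask * self.entity_proj(entity_embs)
        )
        return self.out_proj(fused)


if __name__ == "__main__":
    layer = EntityFusionLayer()
    tokens = torch.randn(2, 16, 768)    # e.g., BERT hidden states
    entities = torch.randn(2, 16, 100)  # e.g., TransE entity embeddings
    mask = (torch.rand(2, 16, 1) > 0.5).float()
    print(layer(tokens, entities, mask).shape)  # torch.Size([2, 16, 768])
```

In the actual models, this fusion is interleaved with self-attention blocks and trained jointly with a denoising entity-prediction objective; the sketch only isolates the representation-mixing step.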
