[Paper Collection] Must-Read Papers on Neural Relation Extraction

Must-read papers on NRE

NRE: Neural Relation Extraction.

Contributed by Tianyu Gao and Xu Han.

We released OpenNRE, an open-source framework for neural relation extraction. This repository provides several relation extraction methods and an easy-to-use training and testing framework.

Survey Papers

  1. A Survey of Deep Learning Methods for Relation Extraction.
    Shantanu Kumar.
    2017.
    paper

  2. Relation Extraction: A Survey.
    Sachin Pawar, Girish K. Palshikar, Pushpak Bhattacharyya.
    2017.
    paper

Datasets

Supervised Datasets

  1. ACE 2005 Dataset link

  2. SemEval-2010 Task 8 Dataset link

Distantly Supervised Datasets

  1. NYT Dataset link

Few-shot Datasets

  1. FewRel: A Large-Scale Supervised Few-Shot Relation Classification Dataset with State-of-the-Art Evaluation.
    Xu Han, Hao Zhu, Pengfei Yu, Ziyun Wang, Yuan Yao, Zhiyuan Liu, Maosong Sun.
    EMNLP 2018.
    paper

    We present a Few-Shot Relation Classification Dataset (FewRel), consisting of 70,000 sentences on 100 relations derived from Wikipedia and annotated by crowdworkers.

Word Vector Tools

  1. Word2vec link

  2. GloVe link

Journal and Conference Papers

Supervised Datasets

  1. SemEval-2010 Task 8: Multi-Way Classification of Semantic Relations Between Pairs of Nominals.
    Iris Hendrickx, Su Nam Kim, Zornitsa Kozareva, Preslav Nakov, Diarmuid Ó Séaghdha, Sebastian Padó, Marco Pennacchiotti, Lorenza Romano, Stan Szpakowicz.
    Workshop on Semantic Evaluations, ACL 2009.
    paper

    This leads us to introduce a new task, which will be part of SemEval-2010: multi-way classification of mutually exclusive semantic relations between pairs of common nominals.

Distantly Supervised Datasets and Training Methods

  1. Learning to Extract Relations from the Web using Minimal Supervision.
    Razvan C. Bunescu, Raymond Mooney.
    ACL 2007.
    paper

    We present a new approach to relation extraction that requires only a handful of training examples. Given a few pairs of named entities known to exhibit or not exhibit a particular relation, bags of sentences containing the pairs are extracted from the web.

  2. Distant Supervision for Relation Extraction without Labeled Data.
    Mike Mintz, Steven Bills, Rion Snow, Dan Jurafsky.
    ACL-IJCNLP 2009.
    paper

    Our experiments use Freebase, a large semantic database of several thousand relations, to provide distant supervision. (A toy sketch of this labeling heuristic follows this list.)

  3. Modeling Relations and Their Mentions without Labeled Text.
    Sebastian Riedel, Limin Yao, Andrew McCallum.
    ECML 2010.
    paper

    We present a novel approach to distant supervision that can alleviate this problem based on the following two ideas: First, we use a factor graph to explicitly model the decision whether two entities are related, and the decision whether this relation is mentioned in a given sentence; second, we apply constraint-driven semi-supervision to train this model without any knowledge about which sentences express the relations in our training KB.

  4. Knowledge-Based Weak Supervision for Information Extraction of Overlapping Relations.
    Raphael Hoffmann, Congle Zhang, Xiao Ling, Luke Zettlemoyer, Daniel S. Weld.
    ACL-HLT 2011.
    paper

    This paper presents a novel approach for multi-instance learning with overlapping relations that combines a sentence-level extraction model with a simple, corpus-level component for aggregating the individual facts.
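The labeling heuristic behind these distantly supervised datasets fits in a few lines. Below is a minimal, self-contained Python sketch of the Mintz et al. alignment step; the KB triples and sentences are invented toy examples, not drawn from any of the datasets above.

```python
# Toy illustration of the distant-supervision heuristic (Mintz et al., 2009):
# every sentence that mentions both entities of a KB triple is labeled with
# that triple's relation. The KB and corpus below are invented examples.

KB = {
    ("Barack Obama", "Hawaii"): "place_of_birth",
    ("Steve Jobs", "Apple"): "founder_of",
}

corpus = [
    "Barack Obama was born in Hawaii .",
    "Barack Obama visited Hawaii last week .",   # noisy match: not a birth statement
    "Steve Jobs founded Apple in 1976 .",
]

def distant_label(sentences, kb):
    """Pair every sentence with each KB triple whose entities it contains."""
    labeled = []
    for sent in sentences:
        for (head, tail), relation in kb.items():
            if head in sent and tail in sent:
                labeled.append((sent, head, tail, relation))
    return labeled

for example in distant_label(corpus, KB):
    print(example)
# The second sentence gets the (possibly wrong) label place_of_birth --
# exactly the noise that the denoising methods below try to handle.
```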

Embeddings

  1. Distributed Representations of Words and Phrases and their Compositionality.
    Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, Jeffrey Dean.
    NIPS 2013.
    paper

    In this paper we present several extensions that improve both the quality of the vectors and the training speed. By subsampling of the frequent words we obtain significant speedup and also learn more regular word representations. We also describe a simple alternative to the hierarchical softmax called negative sampling. (A toy sketch of the negative-sampling objective follows this list.)

  2. GloVe: Global Vectors for Word Representation.
    Jeffrey Pennington, Richard Socher, Christopher D. Manning.
    EMNLP 2014.
    paper

    The result is a new global log-bilinear regression model that combines the advantages of the two major model families in the literature: global matrix factorization and local context window methods.
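Both training objectives quoted above fit in a few lines of NumPy. The sketch below is ours, with illustrative variable names; the hyperparameters (5 negative samples; GloVe's x_max = 100, alpha = 0.75) follow the values reported in the papers.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# --- Skip-gram with negative sampling (Mikolov et al., 2013) ---
# loss = -log sigma(v_c . v_w) - sum_k log sigma(-v_neg_k . v_w)
def sgns_loss(v_w, v_c, v_negs):
    positive = -np.log(sigmoid(v_c @ v_w))
    negative = -np.sum(np.log(sigmoid(-(v_negs @ v_w))))
    return positive + negative

# --- GloVe weighted least squares (Pennington et al., 2014) ---
# J_ij = f(X_ij) * (w_i . w~_j + b_i + b~_j - log X_ij)^2,
# with f(x) = (x / x_max)^alpha capped at 1.
def glove_term(w_i, w_j, b_i, b_j, x_ij, x_max=100.0, alpha=0.75):
    weight = min((x_ij / x_max) ** alpha, 1.0)
    return weight * (w_i @ w_j + b_i + b_j - np.log(x_ij)) ** 2

dim = 50
print(sgns_loss(rng.normal(size=dim), rng.normal(size=dim),
                rng.normal(size=(5, dim))))        # 5 negative samples
print(glove_term(rng.normal(size=dim), rng.normal(size=dim),
                 0.0, 0.0, x_ij=42.0))
```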

Neural Encoders

  1. Semantic Compositionality through Recursive Matrix-Vector Spaces.
    Richard Socher, Brody Huval, Christopher D. Manning, Andrew Y. Ng.
    EMNLP-CoNLL 2012.
    paper

    We introduce a recursive neural network (RNN) model that learns compositional vector representations for phrases and sentences of arbitrary syntactic type and length.

  2. Convolution Neural Network for Relation Extraction.
    Chunyang Liu, Wenbo Sun, Wenhan Chao, Wanxiang Che.
    ADMA 2013.
    paper

    In this paper, we propose a novel convolutional network that incorporates lexical features for relation extraction.

  3. Relation Classification via Convolutional Deep Neural Network.
    Daojian Zeng, Kang Liu, Siwei Lai, Guangyou Zhou, Jun Zhao.
    COLING 2014.
    paper

    We exploit a convolutional deep neural network (DNN) to extract lexical and sentence-level features. Our method takes all of the word tokens as input without complicated pre-processing. (A minimal encoder sketch in this spirit follows this list.)

  4. Classifying Relations by Ranking with Convolutional Neural Networks.
    Cícero Nogueira dos Santos, Bing Xiang, Bowen Zhou.
    ACL 2015.
    paper

    In this work we tackle the relation classification task using a convolutional neural network that performs classification by ranking (CR-CNN). (Its pairwise ranking loss is sketched after this list.)

  5. Relation Extraction: Perspective from Convolutional Neural Networks.
    Thien Huu Nguyen, Ralph Grishman.
    NAACL-HLT 2015.
    paper

    Our model takes advantage of multiple window sizes for filters and pre-trained word embeddings as an initializer on a non-static architecture to improve performance. We emphasize the relation extraction problem with an unbalanced corpus. (The multi-window idea also appears in the encoder sketch after this list.)

  6. End-to-End Relation Extraction using LSTMs on Sequences and Tree Structures.
    Makoto Miwa, Mohit Bansal.
    ACL 2016.
    paper

    Our recurrent neural network based model captures both word sequence and dependency tree substructure information by stacking bidirectional tree-structured LSTM-RNNs on bidirectional sequential LSTM-RNNs… We further encourage detection of entities during training and use of entity information in relation extraction via entity pre-training and scheduled sampling.

  7. A Walk-based Model on Entity Graphs for Relation Extraction.
    Fenia Christopoulou, Makoto Miwa, Sophia Ananiadou.
    ACL 2018.
    paper

    We present a novel graph-based neural network model for relation extraction. Our model treats multiple pairs in a sentence simultaneously and considers interactions among them. All the entities in a sentence are placed as nodes in a fully-connected graph structure.
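Several of the convolutional encoders above (entries 3 and 5 in particular) share the same skeleton: word embeddings concatenated with two position embeddings (each token's distance to each entity), convolutions over one or more window sizes, and a max pool. A minimal PyTorch sketch with illustrative sizes, assuming entity distances are pre-shifted to non-negative indices:

```python
import torch
import torch.nn as nn

class CNNEncoder(nn.Module):
    """Sentence encoder in the spirit of Zeng et al. (2014) and
    Nguyen & Grishman (2015): word + position embeddings, several
    convolution window sizes, max pooling. Sizes are illustrative."""

    def __init__(self, vocab=10000, word_dim=50, pos_dim=5,
                 max_len=100, filters=150, windows=(2, 3, 4, 5)):
        super().__init__()
        self.word_emb = nn.Embedding(vocab, word_dim)
        # relative distances assumed shifted into [0, 2 * max_len)
        self.pos1_emb = nn.Embedding(2 * max_len, pos_dim)
        self.pos2_emb = nn.Embedding(2 * max_len, pos_dim)
        in_dim = word_dim + 2 * pos_dim
        self.convs = nn.ModuleList(
            [nn.Conv1d(in_dim, filters, w, padding=w - 1) for w in windows])

    def forward(self, tokens, pos1, pos2):
        # tokens, pos1, pos2: (batch, seq_len) integer tensors
        x = torch.cat([self.word_emb(tokens),
                       self.pos1_emb(pos1),
                       self.pos2_emb(pos2)], dim=-1).transpose(1, 2)
        pooled = [conv(x).max(dim=2).values for conv in self.convs]
        return torch.tanh(torch.cat(pooled, dim=1))

enc = CNNEncoder()
tokens = torch.randint(0, 10000, (4, 100))
pos = torch.randint(0, 200, (4, 100))
print(enc(tokens, pos, pos.clone()).shape)  # torch.Size([4, 600])
```

The CR-CNN entry (4) replaces the softmax with a pairwise ranking loss. A sketch using the margins m+ = 2.5, m- = 0.5 and gamma = 2 reported in the paper, simplified by ignoring the paper's special handling of the artificial "Other" class:

```python
import torch

def crcnn_ranking_loss(scores, gold, margin_pos=2.5, margin_neg=0.5, gamma=2.0):
    """Push the gold class score above m+ and the best competing
    class score below -m- (dos Santos et al., 2015).
    scores: (batch, n_classes); gold: (batch,) class indices."""
    gold_score = scores.gather(1, gold.unsqueeze(1)).squeeze(1)
    masked = scores.scatter(1, gold.unsqueeze(1), float("-inf"))
    top_wrong = masked.max(dim=1).values
    return (torch.log1p(torch.exp(gamma * (margin_pos - gold_score)))
            + torch.log1p(torch.exp(gamma * (margin_neg + top_wrong)))).mean()

scores = torch.randn(4, 19)          # 19 relation classes, e.g. SemEval-2010
gold = torch.randint(0, 19, (4,))
print(crcnn_ranking_loss(scores, gold))
```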

Denoising Methods

  1. Distant Supervision for Relation Extraction via Piecewise Convolutional Neural Networks.
    Daojian Zeng, Kang Liu, Yubo Chen, Jun Zhao.
    EMNLP 2015.
    paper

    We propose a novel model dubbed the Piecewise Convolutional Neural Networks (PCNNs) with multi-instance learning to address these two problems. (Piecewise max pooling is sketched after this list.)

  2. Neural Relation Extraction with Selective Attention over Instances.
    Yankai Lin, Shiqi Shen, Zhiyuan Liu, Huanbo Luan, Maosong Sun.
    ACL 2016.
    paper

    Distant supervision inevitably suffers from the wrong labelling problem, and such noisy data substantially hurts the performance of relation extraction. To alleviate this issue, we propose a sentence-level attention-based model for relation extraction. (Selective attention is sketched after this list.)

  3. Relation Extraction with Multi-instance Multi-label Convolutional Neural Networks.
    Xiaotian Jiang, Quan Wang, Peng Li, Bin Wang.
    COLING 2016.
    paper

    In this paper, we propose a multi-instance multi-label convolutional neural network for distantly supervised RE. It first relaxes the expressed-at-least-once assumption, and employs cross-sentence max-pooling so as to enable information sharing across different sentences.

  4. Adversarial Training for Relation Extraction.
    Yi Wu, David Bamman, Stuart Russell.
    EMNLP 2017.
    paper

    Adversarial training is a means of regularizing classification algorithms by adding adversarial noise to the training data. We apply adversarial training to relation extraction within the multi-instance multi-label learning framework. (The one-step perturbation is sketched after this list.)

  5. A Soft-label Method for Noise-tolerant Distantly Supervised Relation Extraction.
    Tianyu Liu, Kexiang Wang, Baobao Chang, Zhifang Sui.
    EMNLP 2017.
    paper

    We introduce an entity-pair-level denoising method which exploits semantic information from correctly labeled entity pairs to correct wrong labels dynamically during training.

  6. DSGAN: Generative Adversarial Training for Distant Supervision Relation Extraction.
    Pengda Qin, Weiran Xu, William Yang Wang.
    ACL 2018.
    paper

    We introduce an adversarial learning framework, which we named DSGAN, to learn a sentence-level true-positive generator. Inspired by Generative Adversarial Networks, we regard the positive samples generated by the generator as the negative samples to train the discriminator.

  7. Reinforcement Learning for Relation Classification from Noisy Data.
    Jun Feng, Minlie Huang, Li Zhao, Yang Yang, Xiaoyan Zhu.
    AAAI 2018.
    paper

    We propose a novel model for relation classification at the sentence level from noisy data. The model has two modules: an instance selector and a relation classifier. The instance selector chooses high-quality sentences with reinforcement learning and feeds the selected sentences into the relation classifier, and the relation classifier makes sentence-level prediction and provides rewards to the instance selector.

  8. Robust Distant Supervision Relation Extraction via Deep Reinforcement Learning.
    Pengda Qin, Weiran Xu, William Yang Wang.
    ACL 2018.
    paper

    We explore a deep reinforcement learning strategy to generate the false-positive indicator, where we automatically recognize false positives for each relation type without any supervised information.
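Three of the mechanisms above are compact enough to sketch directly. Below, in order: the piecewise max pooling of entry 1, the selective attention of entry 2 (the paper's bilinear form x_i A r is folded into a single query vector for brevity), and the one-step adversarial perturbation of entry 4. All shapes, names, and the epsilon value are illustrative, not taken from the papers.

```python
import torch
import torch.nn.functional as F

def piecewise_max_pool(conv_out, e1_pos, e2_pos):
    """Zeng et al. (2015): max-pool the three segments delimited by the
    two entity positions instead of the whole sentence, so the pooled
    vector keeps coarse structural information.
    conv_out: (filters, seq_len); assumes 0 <= e1_pos < e2_pos < seq_len - 1."""
    segments = (conv_out[:, :e1_pos + 1],
                conv_out[:, e1_pos + 1:e2_pos + 1],
                conv_out[:, e2_pos + 1:])
    return torch.tanh(torch.cat([s.max(dim=1).values for s in segments]))

def selective_attention(sentence_reps, relation_query):
    """Lin et al. (2016): weight the sentences of a bag by how well they
    match a relation query, so mislabeled sentences get small weights.
    sentence_reps: (bag_size, dim); relation_query: (dim,)."""
    alpha = F.softmax(sentence_reps @ relation_query, dim=0)
    return alpha @ sentence_reps  # bag representation, (dim,)

def adversarial_perturbation(loss, embeddings, epsilon=0.01):
    """Wu et al. (2017): one gradient step of adversarial noise on the
    word embeddings; the model is then also trained on the perturbed
    input. epsilon is illustrative."""
    grad, = torch.autograd.grad(loss, embeddings, retain_graph=True)
    return epsilon * grad / (grad.norm() + 1e-12)

print(piecewise_max_pool(torch.randn(230, 40), 5, 20).shape)   # (690,)
reps, query = torch.randn(8, 690), torch.randn(690)
print(selective_attention(reps, query).shape)                  # (690,)
```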

Extensions

  1. Neural Knowledge Acquisition via Mutual Attention between Knowledge Graph and Text.
    Xu Han, Zhiyuan Liu, Maosong Sun.
    AAAI 2018.
    paper

    We propose a general joint representation learning framework for knowledge acquisition (KA) on two tasks, knowledge graph completion (KGC) and relation extraction (RE) from text. We propose an effective mutual attention between KGs and text. The reciprocal attention mechanism enables us to highlight important features and perform better KGC and RE.

  2. Hierarchical Relation Extraction with Coarse-to-Fine Grained Attention.
    Xu Han, Pengfei Yu, Zhiyuan Liu, Maosong Sun, Peng Li.
    EMNLP 2018.
    paper

    We aim to incorporate the hierarchical information of relations for distantly supervised relation extraction and propose a novel hierarchical attention scheme. The multiple layers of our hierarchical attention scheme provide coarse-to-fine granularity to better identify valid instances, which is especially effective for extracting long-tail relations. (A minimal sketch of this coarse-to-fine attention follows this list.)

  3. Incorporating Relation Paths in Neural Relation Extraction.
    Wenyuan Zeng, Yankai Lin, Zhiyuan Liu, Maosong Sun.
    EMNLP 2017.
    paper

    We build inference chains between two target entities via intermediate entities, and propose a path-based neural relation extraction model to encode the relational semantics from both direct sentences and inference chains.

  4. RESIDE: Improving Distantly-Supervised Neural Relation Extraction using Side Information.
    Shikhar Vashishth, Rishabh Joshi, Sai Suman Prayaga, Chiranjib Bhattacharyya, Partha Talukdar.
    EMNLP 2018.
    paper

    In this paper, we propose RESIDE, a distantly-supervised neural relation extraction method which utilizes additional side information from KBs for improved relation extraction. It uses entity type and relation alias information for imposing soft constraints while predicting relations.
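A minimal sketch of the coarse-to-fine attention in entry 2, assuming a fixed 3-layer relation hierarchy and substituting dot-product scoring for the paper's bilinear form; all sizes are illustrative:

```python
import torch
import torch.nn.functional as F

def hierarchical_attention(sentence_reps, layer_queries):
    """Coarse-to-fine attention after Han et al. (EMNLP 2018): attend over
    the sentences of a bag once per layer of the relation hierarchy
    (coarse ancestor relations first, the exact relation last) and
    concatenate the per-layer bag vectors.
    sentence_reps: (bag_size, dim); layer_queries: list of (dim,) vectors."""
    bags = []
    for query in layer_queries:           # one query per hierarchy layer
        alpha = F.softmax(sentence_reps @ query, dim=0)
        bags.append(alpha @ sentence_reps)
    return torch.cat(bags)                # (n_layers * dim,)

reps = torch.randn(8, 230)                           # a bag of 8 sentences
queries = [torch.randn(230) for _ in range(3)]       # 3-layer hierarchy
print(hierarchical_attention(reps, queries).shape)   # torch.Size([690])
```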
