Linguistically-Informed Self-Attention for Semantic Role Labeling (paper notes)

  1. jointly predict parts of speech and predicates
    • parts of speech: POS tagging for every token.
    • predicates: predicate detection, a subtask of Semantic Role Labeling that marks the predicates in a sentence so the later SRL step knows which words to label arguments for (a small sketch of this joint head follows the list).
  2. perform parsing and attend to syntactic parse parents
    • Syntactically-informed self-attention, adapted from the biaffine parser of [2], performs dependency parsing inside one attention head. Note that this layer receives a supervised signal: each token's syntactic head is predicted from the attention weights A_parse, and Q_parse and K_parse are also used to predict the dependency label (see the sketch after this list).
  3. assigning semantic role labels
    • Given a predicate, predict the semantic role of every token with respect to that predicate (a sketch of this scoring step also follows the list).
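
The following is a minimal sketch of how the first stage, joint POS and predicate prediction, could be wired: two softmax classifiers over the same shared encoder states. The class name JointPosPredicateHead and all dimensions are assumptions for illustration, not the paper's exact setup.

```python
import torch
import torch.nn as nn

class JointPosPredicateHead(nn.Module):
    # Two classifiers over one shared encoder: POS tags and predicate / non-predicate.
    def __init__(self, d_model, n_pos_tags):
        super().__init__()
        self.pos_classifier = nn.Linear(d_model, n_pos_tags)   # POS tag per token
        self.predicate_classifier = nn.Linear(d_model, 2)      # predicate vs. not

    def forward(self, x):
        # x: (batch, seq, d_model), states from a lower self-attention layer
        return self.pos_classifier(x), self.predicate_classifier(x)
```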
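For the syntactically-informed self-attention head of stage 2, here is a minimal PyTorch sketch. It assumes a single attention head whose scores come from a simplified biaffine product in the spirit of [2], supervised so that each row of A_parse points at the token's gold head, with dependency labels predicted from Q_parse and K_parse. Padding masks, the extra bias terms of the full biaffine scorer, and the multi-head plumbing are omitted; all names and dimensions are illustrative.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SyntacticAttentionHead(nn.Module):
    def __init__(self, d_model, d_attn, n_labels):
        super().__init__()
        self.q_parse = nn.Linear(d_model, d_attn)  # Q_parse
        self.k_parse = nn.Linear(d_model, d_attn)  # K_parse
        self.v_parse = nn.Linear(d_model, d_attn)  # values for this head
        # simplified biaffine score q_i^T U k_j (bias terms of [2] omitted)
        self.U = nn.Parameter(torch.randn(d_attn, d_attn) * 0.01)
        # dependency-label classifier over (token, head) pairs
        self.label_scorer = nn.Bilinear(d_attn, d_attn, n_labels)

    def forward(self, x, gold_heads=None, gold_labels=None):
        # x: (batch, seq, d_model); gold_heads / gold_labels: (batch, seq) or None
        q, k, v = self.q_parse(x), self.k_parse(x), self.v_parse(x)
        # scores[b, i, j]: how plausible it is that token j is the head of token i
        scores = torch.einsum('bid,de,bje->bij', q, self.U, k)
        a_parse = scores.softmax(dim=-1)            # A_parse
        context = a_parse @ v                       # output of this attention head

        loss = torch.tensor(0.0, device=x.device)
        if gold_heads is not None:
            # supervision: each row of A_parse should point at the gold head
            loss = F.cross_entropy(scores.flatten(0, 1), gold_heads.flatten())
        if gold_heads is not None and gold_labels is not None:
            # dependency label from Q_parse of the token and K_parse of its head
            head_k = torch.gather(
                k, 1, gold_heads.unsqueeze(-1).expand(-1, -1, k.size(-1)))
            label_logits = self.label_scorer(q, head_k)
            loss = loss + F.cross_entropy(
                label_logits.flatten(0, 1), gold_labels.flatten())
        return context, a_parse, loss
```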
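Finally, for stage 3, a sketch of one way the role labels could be scored: project the given predicate's state and every token's state, then apply a bilinear layer to get per-token role logits. RoleScorer, the two projections, and the dimensions are again assumptions for illustration rather than the exact formulation in the paper.

```python
import torch
import torch.nn as nn

class RoleScorer(nn.Module):
    def __init__(self, d_model, d_role, n_roles):
        super().__init__()
        self.pred_proj = nn.Linear(d_model, d_role)  # predicate projection
        self.arg_proj = nn.Linear(d_model, d_role)   # argument (token) projection
        self.bilinear = nn.Bilinear(d_role, d_role, n_roles)

    def forward(self, x, predicate_index):
        # x: (batch, seq, d_model); predicate_index: (batch,) position of the given predicate
        pred = x[torch.arange(x.size(0)), predicate_index]            # (batch, d_model)
        pred = self.pred_proj(pred).unsqueeze(1).repeat(1, x.size(1), 1)
        args = self.arg_proj(x)
        # logits[b, t, r]: score of semantic role r for token t w.r.t. the predicate
        return self.bilinear(pred, args)
```

In the full model these pieces would sit on one shared stack of self-attention layers, with the predicate classifier deciding which positions are handed to the role scorer at inference time.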

References:

  1. Linguistically-Informed Self-Attention for Semantic Role Labeling
  2. Deep Biaffine Attention for Neural Dependency Parsing
  3. Deep Semantic Role Labeling: What Works and What’s Next
  4. http://www.hankcs.com/nlp/parsing/deep-biaffine-attention-for-neural-dependency-parsing.html
  5. https://blog.csdn.net/mingzai624/article/details/78061506
