Reading notes on 《Targeted Aspect-Based Sentiment Analysis via Embedding Commonsense Knowledge into an Attentive LSTM》

Paper link: http://www.sentic.net/sentic-lstm.pdf

Venue: AAAI 2018

The paper's characterization of the attention mechanism: "Such mechanism takes an external memory and representations of a sequence as input and produces a probability distribution quantifying the concerns in each position of the sequence."



The paper's approach:

[Figure: model architecture]

The sentence is first fed into a bidirectional LSTM, followed by the attention components. The output of the target-level attention serves as the target-level representation; this target representation is then combined with the aspect embedding to compute a sentence-level attention, which condenses the whole sentence into a single vector.

1. Target-level Attention

My understanding: the paper does not explain how H′ is computed; it seems another LSTM is used to produce the hidden states of the target words.

Self-attention:

[Figure: target-level attention]
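A minimal NumPy sketch of the idea: self-attention scores each target word's hidden state and pools them into a single target representation. The scoring vector `w` and the stand-in hidden states `H_t` are illustrative assumptions, not the paper's learned parameters.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(0)
hidden, n_target = 4, 2

# Stand-in for H': hidden states of the target words (the note above guesses
# these come from a separate LSTM pass; the paper does not specify).
H_t = rng.normal(size=(n_target, hidden))

w = rng.normal(size=(hidden,))      # hypothetical scoring vector (assumption)
alpha = softmax(H_t @ w)            # attention distribution over target words
t_vec = alpha @ H_t                 # target-level representation

assert np.isclose(alpha.sum(), 1.0)
```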

2. Sentence-level Attention Model

[Figure: sentence-level attention]
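A sketch of the sentence-level step, assuming a simple additive query built from the target representation and the aspect embedding (the paper's exact scoring function may differ): each position of the BiLSTM output is scored against the query, and the weighted sum becomes the sentence vector fed to the classifier.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

rng = np.random.default_rng(1)
seq_len, hidden = 6, 4

H = rng.normal(size=(seq_len, hidden))  # BiLSTM hidden states of the sentence
t_vec = rng.normal(size=(hidden,))      # target representation from step 1
aspect = rng.normal(size=(hidden,))     # aspect embedding

query = t_vec + aspect                  # additive query (an assumption here)
beta = softmax(H @ query)               # attention over sentence positions
s_vec = beta @ H                        # sentence vector for classification

assert s_vec.shape == (hidden,)
```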

3. Commonsense Knowledge

Commonsense knowledge, using SenticNet.

4. Sentic LSTM

[Figure: Sentic LSTM, augmented with SenticNet information]
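The core idea, sketched below under simplifying assumptions: an LSTM cell whose gates also condition on a concept embedding `mu` drawn from SenticNet, so commonsense knowledge can modulate what is written to and read from the cell state. This is an illustrative single-step cell with random stand-in weights, not the paper's exact gating equations.

```python
import numpy as np

def sigmoid(x):
    return 1 / (1 + np.exp(-x))

rng = np.random.default_rng(2)
d_x, d_h, d_k = 3, 4, 3   # input, hidden, concept dims (illustrative)

# Random stand-in parameters; real weights would be learned.
W = {g: rng.normal(scale=0.1, size=(d_h, d_x + d_h + d_k)) for g in "ifoc"}
b = {g: np.zeros(d_h) for g in "ifoc"}

def sentic_lstm_step(x, h, c, mu):
    """One LSTM step whose gates also see a concept embedding mu
    (a sketch of the idea; the paper's gating differs in detail)."""
    z = np.concatenate([x, h, mu])
    i = sigmoid(W["i"] @ z + b["i"])   # input gate, knowledge-aware
    f = sigmoid(W["f"] @ z + b["f"])   # forget gate
    o = sigmoid(W["o"] @ z + b["o"])   # output gate
    g = np.tanh(W["c"] @ z + b["c"])   # candidate cell state
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

x = rng.normal(size=d_x)
mu = rng.normal(size=d_k)              # SenticNet concept embedding (stand-in)
h, c = sentic_lstm_step(x, np.zeros(d_h), np.zeros(d_h), mu)
assert h.shape == (d_h,)
```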

My understanding: this paper addresses a new problem, sentiment analysis at both the target and aspect levels. It proposes a hierarchical attention mechanism for classification and incorporates external information from SenticNet, extending the LSTM architecture.
