[ICLR2015]Neural Machine Translation by Jointly Learning to Align and Translate

I read this paper because both "Incorporating Copying Mechanism in Sequence-to-Sequence Learning" and "Pointing the Unknown Words" cite it.
The core idea is to use attention to learn a soft alignment between source and target words and then translate based on that alignment. Most online introductions to the attention model use machine translation as the running example, which means they are effectively explaining this very paper. For this paper, understanding the attention model is the key point; a good blog post explaining it is linked below, so I won't repeat the details here.
http://blog.csdn.net/xbinworld/article/details/54607525
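To make the alignment idea concrete, here is a minimal NumPy sketch of the paper's additive attention: the score e_ij = v^T tanh(W_s s_{i-1} + W_h h_j) is computed for each source annotation h_j, softmax-normalized into alignment weights, and used to form the context vector. All names and dimensions below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def additive_attention(s_prev, H, W_s, W_h, v):
    """Bahdanau-style additive attention (illustrative sketch).

    s_prev: previous decoder state s_{i-1}, shape (n,)
    H:      encoder annotations h_1..h_T stacked, shape (T, m)
    W_s, W_h, v: learned alignment-model parameters (here just arrays)
    Returns (context, alpha): context vector and alignment weights.
    """
    # e_ij = v^T tanh(W_s s_{i-1} + W_h h_j), computed for all j at once
    scores = np.tanh(s_prev @ W_s.T + H @ W_h.T) @ v   # shape (T,)
    alpha = softmax(scores)        # alignment weights over source positions
    context = alpha @ H            # weighted sum of annotations, shape (m,)
    return context, alpha
```

The softmax weights alpha are exactly the "soft alignment" the paper visualizes as an attention heatmap between source and target words.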
