PyTorch Notes - Seq2Seq + Attention Algorithm

Autoregressive sequence modeling, where the source and target sequences belong to two different spaces.

Two papers:

  1. Seq2Seq & Attention - Neural Machine Translation by Jointly Learning to Align and Translate
    1. Translate & Alignment
  2. Seq2Seq & Local Attention - Effective Approaches to Attention-based Neural Machine Translation
    1. content-based function: considers both content and position (encoder and decoder states)
    2. location-based function: considers position only (decoder state)
    3. Monotonic, Predictive, and Gaussian-distribution alignment
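The two score-function families above can be sketched in a few lines of PyTorch. This is a minimal illustration, not the paper's code: tensor shapes, the seed, and the weight names (`W_a`, `W_p`) are assumptions chosen for clarity.

```python
import torch

torch.manual_seed(0)

d = 8   # hidden size (illustrative)
T = 5   # number of encoder time steps (illustrative)
h_t = torch.randn(1, d)   # current decoder hidden state
h_s = torch.randn(T, d)   # encoder hidden states

# content-based score: uses both the decoder state and the encoder states
W_a = torch.randn(d, d)
score_content = h_t @ W_a @ h_s.T   # (1, T)

# location-based score: uses the decoder state only
W_p = torch.randn(d, T)
score_location = h_t @ W_p          # (1, T)

# either score is normalized into alignment weights over source positions
a_t = torch.softmax(score_content, dim=-1)   # rows sum to 1
print(a_t.shape)
```

The contrast is visible in the inputs: the content-based score consumes `h_s`, while the location-based score predicts a weight per source position from `h_t` alone.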

Neural Machine Translation by Jointly Learning to Align and Translate

Task: machine translation

NMT: Neural Machine Translation

Encoder - Decoder architecture; generation ends when a stop token is produced.
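The encoder-decoder loop with a stop token can be sketched as a toy greedy decoder. Everything here is an illustrative assumption (vocabulary size, GRU cells, the `<sos>`/`EOS` token ids, the length cap), not the paper's model.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
V, d, EOS = 10, 16, 0   # vocab size, hidden size, stop-token id (assumed)

emb = nn.Embedding(V, d)
encoder = nn.GRU(d, d, batch_first=True)
decoder = nn.GRUCell(d, d)
out_proj = nn.Linear(d, V)

src = torch.tensor([[3, 5, 7, 2]])   # one toy source sentence
_, h = encoder(emb(src))             # final encoder state as the context
h = h.squeeze(0)                     # (1, d)

tok = torch.tensor([1])              # assumed <sos> token id
generated = []
for _ in range(20):                  # hard cap on output length
    h = decoder(emb(tok), h)
    tok = out_proj(h).argmax(dim=-1) # greedy choice of the next token
    if tok.item() == EOS:            # stop token terminates generation
        break
    generated.append(tok.item())
print(generated)
```

The loop is autoregressive: each step feeds the previously generated token back in, and the stop token is what lets the model decide the output length itself.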

SOTA on English-to-French translation; the fixed-length context vector is replaced by soft alignment.

The model jointly learns to align and translate.
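The soft-alignment idea can be sketched as additive attention over the encoder annotations. This is a minimal sketch: the names `W`, `U`, `v` follow common notation for this style of attention, and the shapes and seed are illustrative assumptions.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

d = 8                              # hidden size (illustrative)
T = 6                              # encoder time steps (illustrative)
s_prev = torch.randn(1, d)         # previous decoder state s_{i-1}
H = torch.randn(T, d)              # encoder annotations h_1 .. h_T

W = nn.Linear(d, d, bias=False)    # projects the decoder state
U = nn.Linear(d, d, bias=False)    # projects the encoder annotations
v = nn.Linear(d, 1, bias=False)    # scores the combined energy

# alignment energies e_ij = v^T tanh(W s_{i-1} + U h_j), one per source position
e = v(torch.tanh(W(s_prev) + U(H))).squeeze(-1)   # (T,)
alpha = torch.softmax(e, dim=-1)                  # soft alignment weights
context = alpha @ H                               # expected annotation, (d,)
print(alpha.shape, context.shape)
```

Because `alpha` is a distribution over all source positions, the decoder reads a freshly weighted context at every step instead of a single fixed-length summary vector.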
