Natural Language Processing: Seq2Seq_01

1. Padding:

First, we need to convert variable-length sequences into fixed-length sequences by padding.

Take the following query-response pair as an example.

 Q: How are you?

 A: I'm fine.

After padding (appending a special PAD token until the fixed length is reached; say the fixed length is 8), it becomes:

 Q: How are you ? PAD PAD PAD PAD

 A: I'm fine . PAD PAD PAD PAD PAD

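The padding step can be sketched as a small helper. This is a minimal illustration, not any particular library's API; the PAD token string and the fixed length of 8 are assumptions for the example:

```python
PAD = "PAD"  # assumed padding token for illustration

def pad_sequence(tokens, max_len, pad_token=PAD):
    """Right-pad a token list with pad_token, truncating if it is too long."""
    return (tokens + [pad_token] * max_len)[:max_len]

query = ["How", "are", "you", "?"]
padded = pad_sequence(query, 8)
print(padded)  # ['How', 'are', 'you', '?', 'PAD', 'PAD', 'PAD', 'PAD']
```

Every sequence in a batch now has the same length, so the batch can be stored as one rectangular tensor.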
2. Bucketing:

I don't want every sequence to share one fixed length, so instead I consider a list of buckets: [(5, 10), (10, 15), (20, 25), (40, 50)], where each pair is (maximum query length, maximum response length).

Taking the same sentences as the example: the query has 4 tokens and the response has 3, so the pair falls into the smallest bucket that fits both, (5, 10), and is only padded up to those lengths.


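Choosing a bucket can be sketched as follows; this is a minimal illustration of the idea (the `pick_bucket` helper is hypothetical, not from any framework):

```python
BUCKETS = [(5, 10), (10, 15), (20, 25), (40, 50)]

def pick_bucket(src_len, tgt_len, buckets=BUCKETS):
    """Return the index of the smallest bucket that fits both lengths."""
    for i, (src_max, tgt_max) in enumerate(buckets):
        if src_len <= src_max and tgt_len <= tgt_max:
            return i
    raise ValueError("sequence pair too long for all buckets")

# Q: "How are you ?" has 4 tokens; A: "I'm fine ." has 3 tokens.
bucket_id = pick_bucket(4, 3)
print(BUCKETS[bucket_id])  # (5, 10)
```

Each training pair is then padded only to its bucket's sizes, which wastes far fewer PAD tokens than one global fixed length.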
3. Word embedding:

In the Seq2Seq model, the weights of the embedding layer are jointly trained with the other parameters of the model.

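"Jointly trained" just means the embedding matrix is an ordinary weight matrix: a lookup selects rows, and gradients flow back into those rows during the optimizer step. A minimal NumPy sketch of that idea (the gradient values and learning rate are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size, embed_dim = 10, 4

# The embedding matrix: one row of embed_dim floats per word id.
E = rng.normal(size=(vocab_size, embed_dim))
E_before = E.copy()

word_ids = np.array([2, 5, 2])   # a toy input sequence (word 2 appears twice)
vectors = E[word_ids]            # embedding lookup = row selection, shape (3, 4)

# Suppose the rest of the network produced this gradient for the looked-up rows:
grad_vectors = np.ones_like(vectors)
lr = 0.1
np.add.at(E, word_ids, -lr * grad_vectors)  # SGD step; repeated ids accumulate
```

Only the rows that were looked up get updated, and a word that occurs twice in the batch accumulates twice the gradient.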

4. Attention Mechanism (I think this is very distinctive and very useful)

[Figure 1: illustration of the attention mechanism]
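The core of attention is: score the decoder's current hidden state against every encoder state, normalize the scores with a softmax, and take the weighted sum of the encoder states as the context vector. A minimal scaled dot-product sketch in NumPy (the dimensions and random states are assumptions for illustration, and the original post may have used a different scoring function such as Bahdanau's additive score):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def dot_product_attention(query, keys, values):
    """query: (d,); keys, values: (T, d). Returns (context, weights)."""
    scores = keys @ query / np.sqrt(query.shape[-1])  # similarity to each encoder state
    weights = softmax(scores)                         # attention distribution over T steps
    context = weights @ values                        # weighted sum of encoder states
    return context, weights

rng = np.random.default_rng(0)
enc_states = rng.normal(size=(4, 8))  # 4 encoder time steps, hidden size 8
dec_state = rng.normal(size=(8,))     # current decoder hidden state
context, weights = dot_product_attention(dec_state, enc_states, enc_states)
```

The weights sum to 1, so the context vector is a convex combination of encoder states: the decoder "looks back" at the most relevant input positions at every output step, instead of compressing the whole input into one fixed vector.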



