[Paper Reading Notes --- 13] StructBERT: Incorporating Language Structures into Pre-training for Deep Language Understanding

The BERT pre-training approach

[Figure 1: the BERT pre-training model]

BERT's two pre-training sub-tasks

  • Mask Token Prediction: for each masked position, a multi-class classification task that predicts the correct word from the vocabulary (a minimal masking sketch follows the figures below).
  • Next Sentence Prediction: given two sentences, classify whether S1 and S2 are consecutive sentences.
[Figure 2: Mask Token Prediction]
[Figure 3: Next Sentence Prediction]
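As a concrete illustration of how Mask Token Prediction inputs are built, here is a minimal sketch of BERT-style masking with the standard 80/10/10 replacement rule. The vocabulary size, `[MASK]` token id, and ignore index are illustrative assumptions, not values from the paper.

```python
import random

# Minimal sketch of BERT-style masking with the 80/10/10 replacement rule.
# MASK_ID, VOCAB_SIZE, and IGNORE_INDEX are illustrative values.
MASK_ID = 103
VOCAB_SIZE = 30522
IGNORE_INDEX = -100  # label for positions that do not contribute to the loss

def mask_tokens(token_ids, mask_prob=0.15):
    inputs = list(token_ids)
    labels = [IGNORE_INDEX] * len(token_ids)
    for i, tok in enumerate(token_ids):
        if random.random() >= mask_prob:
            continue
        labels[i] = tok                                # predict the original word here
        r = random.random()
        if r < 0.8:
            inputs[i] = MASK_ID                        # 80%: replace with [MASK]
        elif r < 0.9:
            inputs[i] = random.randrange(VOCAB_SIZE)   # 10%: replace with a random token
        # remaining 10%: keep the original token unchanged
    return inputs, labels
```

The model then runs a softmax over the vocabulary at each labeled position and is trained with cross-entropy against the original token.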

StructBERT

[Figure 4]

For a single sentence, StructBERT adds a Word-Level Prediction (word structural) objective:

(1) Predict the masked words (the same Mask Token Prediction task as in BERT).

(2) Select some contiguous trigrams that contain no masked words, shuffle the order within each trigram, and train the model to reconstruct the original trigram (see the sketch below).

In the end, the model predicts the correct word at every position that has been masked or shuffled.
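The following is a minimal sketch of how shuffled-trigram training positions could be constructed. The 5% shuffle rate, token ids, and the simplified handling of overlapping spans are illustrative assumptions rather than details from the paper.

```python
import random

# Minimal sketch of the word structural (trigram-shuffling) objective.
# Overlapping spans and special tokens are ignored for brevity; the 5%
# shuffle rate is an illustrative assumption.
MASK_ID = 103
IGNORE_INDEX = -100

def shuffle_trigrams(token_ids, labels, shuffle_prob=0.05, k=3):
    inputs = list(token_ids)
    for start in range(len(inputs) - k + 1):
        span = inputs[start:start + k]
        # only spans that contain no [MASK] token are eligible for shuffling
        if MASK_ID in span or random.random() >= shuffle_prob:
            continue
        shuffled = span[:]
        random.shuffle(shuffled)
        inputs[start:start + k] = shuffled
        # the model must reconstruct the original word at each shuffled position
        for j in range(k):
            labels[start + j] = span[j]
    return inputs, labels
```

In this sketch the function would be applied after `mask_tokens`, so the label sequence already marks the masked positions and the shuffled positions are added on top of them.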

For a pair of sentences, StructBERT considers a Sentence-Level Prediction (sentence structural) objective:

[Figure 5]

(1) This is framed as a three-way classification task: given a sentence pair (S1, S2), there are three possible cases: S2 is the sentence that follows S1, S2 is the sentence that precedes S1, or S1 and S2 have no adjacency relation.
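Below is a minimal sketch of how such three-way training pairs could be sampled from a corpus. The data layout (a list of documents, each a list of sentences) and the uniform split over the three labels are illustrative assumptions.

```python
import random

# Minimal sketch of sampling pairs for the three-way sentence structural
# objective. The corpus layout and uniform label split are assumptions.
NEXT, PREV, RANDOM = 0, 1, 2

def sample_sentence_pair(documents):
    doc = random.choice(documents)        # assume every document has >= 2 sentences
    i = random.randrange(len(doc) - 1)
    label = random.choice([NEXT, PREV, RANDOM])
    if label == NEXT:
        s1, s2 = doc[i], doc[i + 1]       # S2 follows S1
    elif label == PREV:
        s1, s2 = doc[i + 1], doc[i]       # S2 precedes S1
    else:
        other = random.choice(documents)  # ideally drawn from a different document
        s1, s2 = doc[i], random.choice(other)
    return s1, s2, label
```

The pair is then encoded as in BERT ([CLS] S1 [SEP] S2 [SEP]) and the [CLS] representation is fed to a three-way classifier.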
