Some weights of the model checkpoint at bert_pretrain were not used when initializing BertModel

Error message:

Some weights of the model checkpoint at ./bert_pretrain were not used when initializing BertModel

This warning means the loaded pretrained checkpoint does not exactly match the model class for the current task: either the checkpoint contains parameters the model never uses, or the model has parameters the checkpoint lacks, which are then randomly initialized. To silence the warning, add the following code to the script that runs the model:

from transformers import logging
logging.set_verbosity_error()
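To see what these two lines effectively do, here is a minimal stdlib sketch (an assumption for illustration: transformers' logging module wraps Python's standard logging, and set_verbosity_error() raises the threshold to ERROR; the logger name and messages below are made up):

```python
import logging

# Capture messages so we can observe which ones get through.
logger = logging.getLogger("demo.transformers")  # hypothetical logger name
logger.propagate = False
captured = []

class ListHandler(logging.Handler):
    def emit(self, record):
        captured.append(record.getMessage())

logger.addHandler(ListHandler())

logger.setLevel(logging.WARNING)  # default verbosity: warnings get through
logger.warning("Some weights of the checkpoint were not used")

logger.setLevel(logging.ERROR)    # analogue of set_verbosity_error()
logger.warning("Some weights of the checkpoint were not used")  # filtered out

print(len(captured))  # 1: only the first warning was emitted
```

Note that this only hides the message; the checkpoint is still loaded exactly as before.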

After adding these two lines and rerunning, the warning is no longer printed and the problem is solved.
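The mismatch described above can be sketched with plain Python sets standing in for state_dict keys (all parameter names below are illustrative, not taken from a real checkpoint):

```python
# Toy illustration: the checkpoint and the model class don't share
# exactly the same parameter names, which triggers the warning.
checkpoint_keys = {
    "bert.embeddings.word_embeddings.weight",  # shared backbone weight
    "cls.predictions.bias",                    # pretraining (MLM) head weight
}
model_keys = {
    "bert.embeddings.word_embeddings.weight",
    "classifier.weight",                       # new task head, not in checkpoint
}

unused = checkpoint_keys - model_keys   # -> "were not used when initializing"
missing = model_keys - checkpoint_keys  # -> randomly initialized by the model

print(sorted(unused))   # ['cls.predictions.bias']
print(sorted(missing))  # ['classifier.weight']
```

Both cases are expected when reusing a pretraining checkpoint for a different task, which is why the message is a warning rather than an error.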
