[NLP Notes] The Pretraining + Fine-tuning Paradigm: OpenAI Transformer, ELMo, ULM-FiT, BERT
Table of Contents: OpenAI Transformer · ELMo · ULM-FiT · BERT (basic architecture, Embedding, pretraining & fine-tuning)
[Original paper]: BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
[References for this post]: The Illustrated BERT, ELMo, and co. (How NLP Cracked Transfer Learning)