[nlp] Fixing the error: RuntimeError: Llama is supposed to be a BPE model!

The error is raised by the original tokenizer-loading call:

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)

Change it to pass legacy=False and use_fast=False:

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL, legacy=False, use_fast=False)
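
For context, here is a minimal, self-contained sketch of the fix. BASE_MODEL is a placeholder for whatever LLaMA checkpoint path or Hugging Face Hub ID you are actually loading; with use_fast=False the slow SentencePiece-based tokenizer is used, which typically requires the sentencepiece package to be installed.

from transformers import AutoTokenizer

# Placeholder: replace with your own LLaMA checkpoint directory or Hub ID.
BASE_MODEL = "path/to/your/llama-model"

# use_fast=False loads the slow (SentencePiece-based) tokenizer instead of
# converting to a fast tokenizer, which is where the
# "Llama is supposed to be a BPE model!" error is raised.
# legacy=False opts into the corrected tokenization behavior in newer
# transformers releases.
tokenizer = AutoTokenizer.from_pretrained(
    BASE_MODEL,
    legacy=False,
    use_fast=False,
)

print(tokenizer.tokenize("Hello, Llama!"))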
