[linux] 'LlamaTokenizer' object has no attribute 'sp_model'

Solution:

pip install transformers==4.33.2
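
This AttributeError is commonly reported with transformers >= 4.34, where the tokenizer base class was refactored so that vocabulary-related methods can be called before sp_model is assigned in SentencePiece-based subclasses; pinning to 4.33.2 sidesteps that. A minimal sketch to verify the downgrade took effect (the tokenizer path below is a placeholder, not from the original post):

# Minimal check that the pinned transformers version exposes sp_model.
# "path/to/llama-tokenizer" is a placeholder for your local tokenizer directory.
import transformers
from transformers import LlamaTokenizer

print(transformers.__version__)  # expect 4.33.2 after the downgrade

tokenizer = LlamaTokenizer.from_pretrained("path/to/llama-tokenizer")
print(tokenizer.sp_model.GetPieceSize())  # size of the underlying SentencePiece vocab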

If you then hit another error like ValueError: Non-consecutive added token '' found. Should have index 76524 but has index 0 in saved vocabulary.

re-running the tokenizer merge (merge_tokenizer) to regenerate the merged vocabulary resolves it.
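
For reference, the merge step in this kind of LLaMA vocabulary expansion usually follows the widely used merge_tokenizers.py approach: append the pieces of an extra SentencePiece model to the original LLaMA SentencePiece model and save a fresh tokenizer, so added tokens get consecutive indices. The sketch below assumes that workflow; all paths are placeholders and not taken from the original post.

import os
import sentencepiece as spm
from sentencepiece import sentencepiece_model_pb2 as sp_pb2_model
from transformers import LlamaTokenizer

# Placeholder paths: point these at your own LLaMA tokenizer and extra SentencePiece model.
llama_tokenizer_dir = "path/to/llama-tokenizer"
extra_sp_model_file = "path/to/chinese_sp.model"
output_dir = "path/to/merged_tokenizer"

# Load the original LLaMA tokenizer and the SentencePiece model to merge in.
llama_tokenizer = LlamaTokenizer.from_pretrained(llama_tokenizer_dir)
extra_sp_model = spm.SentencePieceProcessor()
extra_sp_model.Load(extra_sp_model_file)

# Parse both into SentencePiece ModelProto objects.
llama_spm = sp_pb2_model.ModelProto()
llama_spm.ParseFromString(llama_tokenizer.sp_model.serialized_model_proto())
extra_spm = sp_pb2_model.ModelProto()
extra_spm.ParseFromString(extra_sp_model.serialized_model_proto())

# Append every new piece at the end of the LLaMA vocab, keeping indices consecutive.
existing_pieces = {p.piece for p in llama_spm.pieces}
for p in extra_spm.pieces:
    if p.piece not in existing_pieces:
        new_piece = sp_pb2_model.ModelProto.SentencePiece()
        new_piece.piece = p.piece
        new_piece.score = 0
        llama_spm.pieces.append(new_piece)

# Save the merged SentencePiece model and wrap it as a Hugging Face tokenizer.
os.makedirs(output_dir, exist_ok=True)
merged_model_file = os.path.join(output_dir, "merged_tokenizer.model")
with open(merged_model_file, "wb") as f:
    f.write(llama_spm.SerializeToString())
LlamaTokenizer(vocab_file=merged_model_file).save_pretrained(output_dir)

After re-merging, point your training or inference code at output_dir; the added tokens are then saved with consecutive indices, which is what the "Non-consecutive added token" check expects when the vocabulary is reloaded.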
