ValueError: The model's vocab size is set to -1 in params.json (deploying llama-2-chat-7B)

While deploying the llama-2-chat-7B model, I ran into the following problem.
After entering this command:

python3 convert.py --outfile ./models/llama-2-7b-chat ../llama/llama-2-7b-chat/

the following error appeared:

Traceback (most recent call last):
  File "/home/zack/llama.cpp/convert.py", line 1658, in <module>
    main(sys.argv[1:])  # Exclude the first element (script name) from sys.argv
    ^^^^^^^^^^^^^^^^^^
  File "/home/zack/llama.cpp/convert.py", line 1643, in main
    OutputFile.write_all(
  File "/home/zack/llama.cpp/convert.py", line 1188, in write_all
    check_vocab_size(params, vocab, pad_vocab=pad_vocab)
  File "/home/zack/llama.cpp/convert.py", line 993, in check_vocab_size
    raise ValueError(
ValueError: The model's vocab size is set to -1 in params.json. Please update it manually. Maybe 32000?

First, go to the original llama-2-7b-chat folder and edit params.json. If that file doesn't exist, look for any file with a .json extension.
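If you are not sure which JSON file holds the model parameters, listing the JSON files under the model directory narrows it down. The path below matches the one used in the convert command above; adjust it to your own layout:

```shell
# List every JSON file under the original model directory;
# params.json (or config.json for HF-format checkpoints) is the one to edit.
find ../llama/llama-2-7b-chat/ -name '*.json'
```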
Open params.json:

nano params.json
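The file should resemble the following, with vocab_size set to -1. The other values shown here are the usual ones for the 7B checkpoint and are illustrative; only the vocab_size line matters for this error:

```json
{
    "dim": 4096,
    "multiple_of": 256,
    "n_heads": 32,
    "n_layers": 32,
    "norm_eps": 1e-05,
    "vocab_size": -1
}
```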

Change vocab_size to 32000, then save the file.
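If you prefer not to edit the file by hand in nano, a small Python script run from the model directory can patch the value in place. This is a sketch that assumes the file is valid JSON with a top-level "vocab_size" key:

```shell
# Non-interactive alternative to editing in nano:
# rewrite vocab_size in params.json in place.
python3 - <<'EOF'
import json

with open("params.json") as f:
    params = json.load(f)

params["vocab_size"] = 32000  # was -1

with open("params.json", "w") as f:
    json.dump(params, f, indent=4)
EOF
```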


Go back to the llama.cpp folder and run the convert command again:

python3 convert.py --outfile ./models/llama-2-7b-chat ../llama/llama-2-7b-chat/
