python - Running chatglm2-6b-int4 on Huawei Cloud ModelArts' free CodeLab

Prerequisite: ModelArts currently offers a free trial flavor with 8 vCPUs and 64 GB of memory, limited to three hours of use per day.
Address: https://console.huaweicloud.com/modelarts/?region=cn-north-4#/dashboard
Download the model: see the separate article.

Create the environment (the built-in environment ships with PyTorch 1.8, so create a new one yourself)

conda info --env                # list the existing conda environments

conda create --name pytorch2.0  # create a new environment for PyTorch 2.x

conda activate pytorch2.0       # switch into the new environment
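
If you want notebook cells to execute inside this new environment rather than the default kernel, you may also need to register it with Jupyter. A minimal sketch using the standard ipykernel approach (the kernel name and display name below are my own choices, not from the original article):

conda install ipykernel
python -m ipykernel install --user --name pytorch2.0 --display-name "pytorch2.0"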


Install the required packages

# On a GPU instance, install PyTorch with CUDA 11.7 support:
conda install pytorch torchvision torchaudio pytorch-cuda=11.7 -c pytorch -c nvidia

# On a CPU-only instance, install the CPU build instead (run one of the two commands, not both):
conda install pytorch torchvision torchaudio cpuonly -c pytorch

# Optionally update conda itself:
conda update -n base -c defaults conda

conda install transformers sentencepiece

# cpm_kernels is required by the int4 quantized model
pip install rouge_chinese cpm_kernels
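
Before loading the model, you can quickly confirm the environment is set up as expected. A minimal check to run in the notebook (it simply prints whatever versions you installed and whether CUDA is visible):

import torch
import transformers

# Verify the PyTorch and transformers versions and CUDA availability
print(torch.__version__)
print(transformers.__version__)
print(torch.cuda.is_available())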

Run the code

from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("/home/ma-user/work/glm2/chatglm2-6b-int4",
                                          trust_remote_code=True)

# On a GPU instance, load the model onto the GPU:
# model = AutoModel.from_pretrained("/home/ma-user/work/glm2/chatglm2-6b-int4", trust_remote_code=True).cuda()

# On a CPU-only instance, run the model in float32 on the CPU.
# Keep only one of the two loading lines.
model = AutoModel.from_pretrained("/home/ma-user/work/glm2/chatglm2-6b-int4", trust_remote_code=True).float()

model = model.eval()

# Prompt: "What should I do if I can't sleep at night?"
response, history = model.chat(tokenizer, "晚上睡不着应该怎么办", history=[])

print(response)
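
To continue the conversation, pass the returned history back into model.chat. A minimal follow-up sketch (the second question is my own illustrative example, not from the original article):

# Follow-up question: "What are some ways to help fall asleep?"
response, history = model.chat(tokenizer, "有哪些助眠的方法?", history=history)
print(response)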
