Code LLM Comparison

CODE LLM

| Model | Parameters | Model size | Pass@1 | Release date | License | Organization | GPU / RAM requirements | Repository |
|---|---|---|---|---|---|---|---|---|
| CodeGen-16B-multi | 16B | 27.5 GB | 19.2 | 2022-04-01 | Free for commercial use | Salesforce | | https://huggingface.co/Salesforce/codegen-16B-multi/tree/main<br>https://github.com/salesforce/CodeGen |
| CodeGeeX-13B | 13B | | 22.9 | 2022-09-30 | Open source | Tsinghua University | | https://github.com/THUDM/CodeGeeX<br>https://huggingface.co/spaces/THUDM/CodeGeeX |
| Codex-12B | 12B | | 28.8 | | Closed source | OpenAI | | |
| CodeT5Plus-16B-mono | 16B | 41 GB | 30.9 | 2023-05-13 | Free for commercial use | Salesforce | | https://github.com/salesforce/CodeT5<br>https://huggingface.co/Salesforce/codet5p-16b |
| Code-Cushman-001 | | | 33.5 | | Closed source | OpenAI | | |
| LLaMA-65B | 65B | 120 GB | 23.7 | 2023-02-24 | Open source, non-commercial | Meta | | https://github.com/facebookresearch/llama |
| LLaMA2-70B | 70B | 129 GB | 29.9 | 2023-07-18 | Free for commercial use | Meta | | https://github.com/facebookresearch/llama<br>https://huggingface.co/meta-llama/Llama-2-70b |
| CodeGen2.5-7B-mono | 7B | 27 GB | 33.4 | 2023-07-07 | Free for commercial use | Salesforce | | https://github.com/salesforce/CodeGen<br>https://huggingface.co/Salesforce/codegen25-7b-multi |
| StarCoder-15B | 15B | 64 GB | 33.2 | 2023-05-05 | Free for commercial use | BigCode | | https://huggingface.co/bigcode/starcoder<br>https://github.com/bigcode-project/starcoder/tree/main |
| CodeGeeX2-6B | 6B | 12.5 GB | 35.9 | 2023-07-25 | Free for commercial use | Tsinghua University | >13 GB GPU memory, 14 GB RAM | https://github.com/THUDM/CodeGeeX2<br>https://huggingface.co/THUDM/codegeex2-6b |
| GPT-3.5 (175B) | 175B | | 48.1 | 2022-11-30 | Closed source | OpenAI | | |
| WizardCoder-15B | 15B | 31 GB | 57.3 | 2023-06-14 | Free for commercial use | Microsoft | 40 GB RAM | https://github.com/nlpxucan/WizardLM<br>https://huggingface.co/WizardLM/WizardCoder-15B-V1.0 |
| PanGu-Coder2-15B | 15B | | 61.64 | 2023-07-27 | Closed source | Huawei | | https://arxiv.org/pdf/2307.14936.pdf |
| GPT-4 (175B) | 175B | | 67.0 | 2023-03-14 | Closed source | OpenAI | | https://cdn.openai.com/papers/gpt-4.pdf |
| Qwen-7B | 7B | 15.4 GB | | 2023-08-03 | Free for commercial use | Alibaba | >23 GB GPU memory | https://huggingface.co/Qwen/Qwen-7B<br>https://github.com/QwenLM/Qwen-7B |
| ChatGLM-6B | 6.2B | 8 GB | | 2023-03-14 | | Tsinghua University | | https://github.com/THUDM/ChatGLM-6B<br>https://huggingface.co/THUDM/chatglm-6b |
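
The Pass@1 column is the HumanEval functional-correctness score: each generated completion is executed against the problem's unit tests, and Pass@1 is the estimated probability of solving a problem with a single sample. The numbers above are collected from the linked model cards, papers, and harnesses such as abacaj/code-eval, so evaluation settings may differ slightly between rows. For reference, the sketch below implements the standard unbiased pass@k estimator from the Codex paper (Chen et al., 2021); the function name is illustrative.

```python
import numpy as np

def pass_at_k(n: int, c: int, k: int) -> float:
    """Unbiased pass@k estimator (Chen et al., 2021).

    n: total completions sampled for a problem
    c: completions that pass all unit tests
    k: evaluation budget (k=1 gives the Pass@1 numbers in the table)
    """
    if n - c < k:
        return 1.0  # every size-k subset contains at least one passing sample
    return 1.0 - np.prod(1.0 - k / np.arange(n - c + 1, n + 1))

# Example: 200 samples per problem, 60 pass -> estimated pass@1 = 0.3
print(pass_at_k(200, 60, 1))
```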
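
Most of the open models in the table can be run the same way through Hugging Face `transformers`. The minimal sketch below uses `Salesforce/codegen25-7b-multi` from the table; the model id, dtype, prompt, and generation settings are illustrative and should be adjusted to the memory budgets listed above (some repos, e.g. the CodeGen2.5 tokenizer and the THUDM models, require `trust_remote_code=True`).

```python
# Minimal sketch, assuming a GPU large enough for the chosen model
# (see the "Model size" / "GPU / RAM requirements" columns above).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Salesforce/codegen25-7b-multi"  # any causal-LM repo from the table works similarly

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # fp16 weights roughly match the sizes listed above
    device_map="auto",           # shard across available GPUs if needed
    trust_remote_code=True,
)

prompt = "def quicksort(arr):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```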

References:

https://github.com/abacaj/code-eval

https://lmsys.org/

Chatbot Arena Leaderboard: https://huggingface.co/spaces/lmsys/chatbot-arena-leaderboard

https://huggingface.co/WizardLM/WizardLM-30B-V1.0

https://github.com/QwenLM/Qwen-7B

https://github.com/THUDM/ChatGLM2-6B

https://www.datalearner.com/ai-models/pretrained-models?&aiArea=1002&language=-1&contextLength=-1&openSource=-1&publisher=-1
