Phoenix LLM: error during training (installing flash_attn fails with "No module named 'torch'")

Installing flash_attn fails with the following error:

pip install flash_attn
Collecting flash_attn
  Using cached flash_attn-1.0.8.tar.gz (2.0 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error
  
  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [15 lines of output]
      Traceback (most recent call last):
        File "/home/aaa/anaconda3/envs/fenghuang/lib/python3.9/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in
          main()
        File "/home/aaa/anaconda3/envs/fenghuang/lib/python3.9/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
        File "/home/aaa/anaconda3/envs/fenghuang/lib
