Resolving the flash-attn problem when building the LLaVA environment

  1. Installing flash-attn
  • Installing with pip
# Install with pip
pip install flash-attn --no-build-isolation
  • Installing offline from a whl file
    Download the flash-attn whl file and install it offline.
    The prebuilt wheels are hosted on the flash-attention GitHub releases page (https://github.com/Dao-AILab/flash-attention/releases); a sketch of the steps follows.
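A minimal sketch of the offline install, assuming a Python 3.10 / torch 2.2 / CUDA 12.x environment. The wheel URL is the one that appears in the error log below; pick the file whose cp/torch/cu tags match your own setup.
# Download a prebuilt wheel from the releases page, then install it locally
wget https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.2/flash_attn-2.5.2+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
pip install flash_attn-2.5.2+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl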
  2. Problems caused by an incorrectly installed flash-attn
  • The pip install can fail with the error below: the build step tries to download a prebuilt wheel from GitHub and aborts when the connection times out. In that case, install from a whl file as described above.
Building wheel for flash-attn (setup.py) ... error
  error: subprocess-exited-with-error
  
  × python setup.py bdist_wheel did not run successfully.
  │ exit code: 1
  ╰─> [9 lines of output]
      fatal: not a git repository (or any of the parent directories): .git
      
      
      torch.__version__  = 2.2.0+cu121
      
      
      running bdist_wheel
      Guessing wheel URL:  https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.2/flash_attn-2.5.2+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
      error: <urlopen error [Errno 110] Connection timed out>
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
  ERROR: Failed building wheel for flash-attn
  Running setup.py clean for flash-attn
Failed to build flash-attn

  • An incorrectly installed flash-attn can also make the code fail at runtime, for example with the ImportError below.
    Fix: reinstall flash-attn, preferably from a whl file; see above for where to download it. A quick sanity check follows the error message.
ImportError: cannot import name 'LlavaLlamaForCausalLM' from 'llava.model' (/root/LLaVA/llava/model/__init__.py)
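A quick sanity check (a sketch, not from the original post) to confirm that flash-attn and the LLaVA model class import cleanly after reinstalling:
# Should print the flash-attn version and "ok" without an ImportError
python -c "import flash_attn; print(flash_attn.__version__)"
python -c "from llava.model import LlavaLlamaForCausalLM; print('ok')"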
