Large language model (LLM) Phoenix: error during training (installing flash_attn fails with: No module named 'torch')
Installing flash_attn fails with the following output:

    pip install flash_attn
    Collecting flash_attn
      Using cached flash_attn-1.0.8.tar.gz (2.0 MB)
      Installing build dependencies ... done
      Getting requirements to build wheel ... error
      error: subprocess-exited-with-error
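A common cause of this error is that flash_attn's build script imports torch at build time, but pip builds the wheel in an isolated environment where torch is not present. A sketch of the usual workaround (assuming torch is not yet installed in the target environment) is to install torch first, then build flash_attn without pip's isolated build environment so it can see the existing torch:

```shell
# Make sure torch is installed in the current environment first
pip install torch

# --no-build-isolation tells pip to build against the packages already
# installed in this environment (including torch) instead of an isolated env
pip install flash_attn --no-build-isolation
```

If the build still fails, check that the CUDA toolkit version visible to the build matches the one your installed torch was compiled against.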