ImportError: libgfortran.so.4: cannot open shared object file: No such file or directory

Problem description: after installing Anaconda, the following settings were added to the ~/.bashrc file:

export PATH="/home/hadoop/anaconda3/bin:$PATH"
export PYTHONPATH="/home/hadoop/anaconda3/bin:$PYTHONPATH"

After rebooting or running source ~/.bashrc, typing python starts the Anaconda Python by default.
However, after installing scikit-learn under Python 3.6 with pip install sklearn, importing sklearn fails with the following error:

ImportError: libgfortran.so.4: cannot open shared object file: No such file or directory
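Before digging into the library itself, it can help to confirm exactly which interpreter and site-packages directory are actually in use, and whether the dynamic loader can resolve libgfortran at all. A minimal diagnostic (the expected paths are assumptions based on the install above):

```python
# Confirm which interpreter and site-packages the shell is picking up,
# and whether libgfortran is resolvable by the loader at all.
import sys
import sysconfig
from ctypes.util import find_library

print(sys.executable)                    # expected: /home/hadoop/anaconda3/bin/python
print(sysconfig.get_paths()["purelib"])  # the site-packages actually in use
print(find_library("gfortran"))          # None means the loader cannot find it
```

If the last line prints None, the library exists somewhere on disk (as the find commands below show) but is not on the loader's search path.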

Attempted solution 1:
Use conda to create a new Python 3.5 virtual environment that reuses the packages bundled with Anaconda:

conda create -n python35 python=3.5 anaconda

Running python again in this environment, importing sklearn succeeds. However, this still does not fix the import failure in the default Python environment.
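For completeness, the new environment has to be activated before retrying the import; a sketch, assuming an older conda release where source activate is the activation command (newer releases use conda activate instead):

```shell
# Activate the Python 3.5 environment created above, then retry the import.
# "python35" is the environment name chosen in the conda create command.
source activate python35
python -c "import sklearn; print(sklearn.__version__)"
```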

Attempted solution 2:
Run the following commands to locate the missing library inside the Anaconda install:

find ~ -name libgfortran.so.4.0.0

find ~ -name libgfortran.so.4

They return the following results:

hadoop@Master:~$ find ~ -name libgfortran.so.4
find: `/home/hadoop/.cache/thumbnails': Permission denied
find: `/home/hadoop/.cache/dconf': Permission denied
/home/hadoop/anaconda3/pkgs/libgfortran-ng-7.2.0-hdf63c60_3/x86_64-conda_cos6-linux-gnu/sysroot/lib/libgfortran.so.4
/home/hadoop/anaconda3/pkgs/libgfortran-ng-7.2.0-hdf63c60_3/lib/libgfortran.so.4

hadoop@Master:~$ find ~ -name libgfortran.so.4.0.0
find: `/home/hadoop/.cache/thumbnails': Permission denied
find: `/home/hadoop/.cache/dconf': Permission denied
/home/hadoop/anaconda3/pkgs/libgfortran-ng-7.2.0-hdf63c60_3/x86_64-conda_cos6-linux-gnu/sysroot/lib/libgfortran.so.4.0.0
/home/hadoop/anaconda3/pkgs/libgfortran-ng-7.2.0-hdf63c60_3/lib/libgfortran.so.4.0.0

The library exists under pkgs/ but not under anaconda3/lib, which is on the loader's search path. Then run the following command to create a symbolic link there:

ln -s /home/hadoop/anaconda3/pkgs/libgfortran-ng-7.2.0-hdf63c60_3/lib/libgfortran.so.4.0.0  /home/hadoop/anaconda3/lib/libgfortran.so.4
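After creating the symlink, the fix can be verified directly from Python by loading the library with ctypes before importing sklearn. The path below mirrors the symlink created above; adjust it to your own install:

```python
# Verify that the newly linked library is now loadable by the dynamic loader.
import ctypes

lib_path = "/home/hadoop/anaconda3/lib/libgfortran.so.4"
try:
    ctypes.CDLL(lib_path)
    print("loaded OK")
except OSError as exc:
    print("still missing:", exc)
```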

Rerun the default python and import sklearn: the import now succeeds!
