Paddle Pitfall 1: libpython3.10.so.1.0

A collection of Paddle installation pitfalls

The problem:

After installing Paddle, running import paddle threw the following error:

>>> import paddle
Error: Can not import paddle core while this file exists: /opt/conda/envs/piddle/lib/python3.10/site-packages/paddle/fluid/libpaddle.so
Traceback (most recent call last):
  File "", line 1, in <module>
  File "/opt/conda/envs/piddle/lib/python3.10/site-packages/paddle/__init__.py", line 25, in <module>
    from .framework import monkey_patch_variable
  File "/opt/conda/envs/piddle/lib/python3.10/site-packages/paddle/framework/__init__.py", line 17, in <module>
    from . import random  # noqa: F401
  File "/opt/conda/envs/piddle/lib/python3.10/site-packages/paddle/framework/random.py", line 16, in <module>
    import paddle.fluid as fluid
  File "/opt/conda/envs/piddle/lib/python3.10/site-packages/paddle/fluid/__init__.py", line 36, in <module>
    from . import framework
  File "/opt/conda/envs/piddle/lib/python3.10/site-packages/paddle/fluid/framework.py", line 37, in <module>
    from . import core
  File "/opt/conda/envs/piddle/lib/python3.10/site-packages/paddle/fluid/core.py", line 304, in <module>
    raise e
  File "/opt/conda/envs/piddle/lib/python3.10/site-packages/paddle/fluid/core.py", line 249, in <module>
    from . import libpaddle
ImportError: libpython3.10.so.1.0: cannot open shared object file: No such file or directory
>>>

The key line is: libpython3.10.so.1.0: cannot open shared object file: No such file or directory. In other words, Paddle's compiled extension (libpaddle.so) is linked against libpython3.10, but the dynamic linker cannot find that shared library anywhere on its search path.
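
To confirm exactly which shared libraries are missing, you can run ldd on libpaddle.so itself (the path below is taken from the traceback above; adjust it to your own environment):

ldd /opt/conda/envs/piddle/lib/python3.10/site-packages/paddle/fluid/libpaddle.so | grep "not found"

Any library reported as "not found" is one the dynamic linker cannot locate; in this case the list should include libpython3.10.so.1.0.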

The fix

Step 1:

In a terminal, run:

find / -name libpython3.10.so.1.0

You will get output similar to the following. (The sample below happens to come from a Python 3.7 conda environment, so the file it finds is libpython3.7m.so.1.0; for the Python 3.10 error in the traceback above, the file to look for is libpython3.10.so.1.0.)

(nlp) root@n1:~# find / -name libpython3.7m.so.1.0
find: ‘/home/peterpark’: Permission denied
/opt/conda/envs/nlp/lib/libpython3.7m.so.1.0
/opt/conda/lib/libpython3.7m.so.1.0
/opt/conda/pkgs/python-3.7.11-h12debd9_0/lib/libpython3.7m.so.1.0
/opt/conda/pkgs/python-3.7.15-h7a1cb2a_1/lib/libpython3.7m.so.1.0
find: ‘/proc/tty/driver’: Permission denied
find: ‘/proc/27/map_files’: Permission denied
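
Searching all of / is slow and produces a lot of Permission denied noise. If you already know Paddle lives in a conda environment, you can restrict the search to the conda installation instead (a quick sketch, assuming conda is installed under /opt/conda as in the output above):

find /opt/conda -name "libpython3.10*"

This should point you straight at the copy of the library inside the environment you installed Paddle into.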

Step 2:

Pick one of the paths found above (preferably the one inside the conda environment Paddle is installed in) and copy that library into /usr/lib. For the 3.7 example output above, that is:

cp /opt/conda/envs/nlp/lib/libpython3.7m.so.1.0 /usr/lib

(In the Python 3.10 case from the traceback, copy the libpython3.10.so.1.0 you found in step 1 instead.)
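
After the copy, open Python again and confirm the import succeeds. PaddlePaddle also ships a built-in installation check, paddle.utils.run_check(), which makes a handy one-line sanity test:

python -c "import paddle; paddle.utils.run_check()"

If copying files into /usr/lib is not an option (for example, you lack root access), an alternative is to point the dynamic linker at the conda environment's own lib directory. Assuming the piddle environment from the traceback above, that would look like:

export LD_LIBRARY_PATH=/opt/conda/envs/piddle/lib:$LD_LIBRARY_PATH

This only lasts for the current shell; add the line to ~/.bashrc (or the environment's activation scripts) to make it permanent.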

And that's it, problem solved! Writing this up took some effort, so a like would be much appreciated!
