py4j.protocol.Py4JError: org.apache.spark.api.python.PythonUtils... does not exist in the JVM

Environment:

Win7 + Anaconda 4.3.21 (Python 3.6.1) + Spark 2.3.2 + Java 1.8

Code executed:

from pyspark import SparkContext
from pyspark import SparkConf

conf = SparkConf().setAppName("miniProject").setMaster("local[*]")
sc = SparkContext.getOrCreate(conf)

rdd = sc.parallelize([1, 2, 3, 4, 5])
rdd1 = rdd.map(lambda r: r + 10)  # add 10 to every element
print(rdd1.collect())

The following error message appears:

To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Traceback (most recent call last):
  File "C:/Users/Administrator/PycharmProjects/untitled/Learn/fibonaqie.py", line 6, in
    sc=SparkContext.getOrCreate(conf)
  File "C:\ProgramData\Anaconda3\lib\site-packages\pyspark\context.py", line 349, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "C:\ProgramData\Anaconda3\lib\site-packages\pyspark\context.py", line 118, in __init__
    conf, jsc, profiler_cls)
  File "C:\ProgramData\Anaconda3\lib\site-packages\pyspark\context.py", line 195, in _do_init
    self._encryption_enabled = self._jvm.PythonUtils.getEncryptionEnabled(self._jsc)
  File "C:\ProgramData\Anaconda3\lib\site-packages\py4j\java_gateway.py", line 1487, in __getattr__
    "{0}.{1} does not exist in the JVM".format(self._fqn, name))
py4j.protocol.Py4JError: org.apache.spark.api.python.PythonUtils.getEncryptionEnabled does not exist in the JVM
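The missing JVM method is a version clue: `PythonUtils.getEncryptionEnabled` was only introduced in Spark 2.4, so this error typically means the pip-installed `pyspark` client is newer than the Spark installation it connects to (e.g. a 2.4.x client against the Spark 2.3.2 install above). A minimal sketch of the version check, where `versions_compatible` is a hypothetical helper, not part of pyspark:

```python
# Hypothetical helper: the pyspark client and the Spark installation should
# agree at least on major.minor, otherwise the client may call JVM helpers
# (such as PythonUtils.getEncryptionEnabled) that the older Spark lacks.
def versions_compatible(client_version: str, spark_version: str) -> bool:
    """Return True when the major.minor parts match, e.g. '2.3.2' vs '2.3.1'."""
    major_minor = lambda v: tuple(v.split(".")[:2])
    return major_minor(client_version) == major_minor(spark_version)

print(versions_compatible("2.4.0", "2.3.2"))  # False: mismatched client
print(versions_compatible("2.3.2", "2.3.2"))  # True
```

In practice you would compare `pyspark.__version__` in Python against the output of `spark-submit --version`.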


Solution:

Add the following at the very top of the script, before any pyspark import (install the package first with pip install findspark):

import findspark
findspark.init()  # locates SPARK_HOME and puts its bundled pyspark on sys.path
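Putting it together, the fixed script looks like this (a sketch assuming Spark 2.3.2 is installed and SPARK_HOME is set; findspark.init() must run before the pyspark imports, or the pip-installed client is picked up instead):

```python
import findspark
findspark.init()  # must come before any pyspark import

from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("miniProject").setMaster("local[*]")
sc = SparkContext.getOrCreate(conf)

rdd = sc.parallelize([1, 2, 3, 4, 5])
print(rdd.map(lambda r: r + 10).collect())  # [11, 12, 13, 14, 15]
```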

That fixes the problem. Running the code now prints:

2019-01-04 12:51:20 WARN  Utils:66 - Your hostname, master resolves to a loopback address: 127.0.0.1; using 192.168.0.66 instead (on interface eth3)
2019-01-04 12:51:20 WARN  Utils:66 - Set SPARK_LOCAL_IP if you need to bind to another address
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
[11, 12, 13, 14, 15]
