Fixing the "No module named pyspark" and "No module named 'py4j'" errors in Python 3 and Jupyter Notebook
Background: Spark 2.4 and Hive 2.3 are configured on CentOS 7 (CDH 6). Entering `pyspark` in the Linux shell starts the shell normally, and executing the following statement displays its results correctly: from pyspark.sql import SparkSession; spark = SparkSession.builder.appName("Python Spark SQL Hive integration example").enableHiveSup…
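The statement above is cut off mid-word; it appears to be the standard Spark SQL Hive integration example from the Spark documentation. A completed, runnable sketch is shown below, assuming the usual `.enableHiveSupport().getOrCreate()` continuation; the final `show databases` check is an assumption added here for illustration, not part of the original text.

```python
# Sketch of the truncated statement, assuming the standard
# .enableHiveSupport().getOrCreate() continuation.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("Python Spark SQL Hive integration example")
    .enableHiveSupport()   # expose Hive tables to Spark SQL
    .getOrCreate()
)

# Hypothetical quick check that the Hive metastore is reachable
# when run inside the pyspark shell:
spark.sql("show databases").show()
```

Inside the `pyspark` shell this works because the launcher script puts Spark's Python libraries (including the bundled py4j) on the interpreter's path; a plain `python3` or Jupyter kernel does not get that setup, which is what leads to the errors in the title.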