Launching the pyspark shell on Windows fails with: Failed to find Spark jars directory. You need to build Spark before running ...

D:\Develop tools\spark-2.2.0-bin-hadoop2.7\bin>pyspark2.cmd

'tools\spark-2.2.0-bin-hadoop2.7\bin\..\jars""\' is not recognized as an internal or external command,
operable program or batch file.
Failed to find Spark jars directory.
You need to build Spark before running this program.


Cause: the installation path contains a space ("Develop tools" in D:\Develop tools\spark-2.2.0-bin-hadoop2.7\bin). The Spark launcher batch scripts do not quote this path, so the command line is split at the space and cmd.exe tries to run the truncated fragment 'tools\...' as a program. The usual fix is to move or reinstall Spark under a path without spaces (e.g. D:\spark-2.2.0-bin-hadoop2.7) and update SPARK_HOME and PATH accordingly.
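The problem can be caught up front by checking the install path before launching. The sketch below is an illustrative helper, not part of Spark itself; the function name and warning text are made up for this example:

```python
def check_spark_home(path):
    """Return a warning string if the Spark install path contains spaces.

    Unquoted paths with spaces break the Windows launcher batch scripts,
    producing the 'Failed to find Spark jars directory' error above.
    """
    if " " in path:
        return ("Path '%s' contains spaces; move Spark to a space-free "
                "location such as D:\\spark-2.2.0-bin-hadoop2.7" % path)
    return None

# The failing path from the error above vs. a space-free alternative
print(check_spark_home(r"D:\Develop tools\spark-2.2.0-bin-hadoop2.7"))
print(check_spark_home(r"D:\spark-2.2.0-bin-hadoop2.7"))
```

Running the check on the original path returns a warning, while the space-free path passes.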

Reposted from: https://www.cnblogs.com/144823836yj/p/11275408.html
