How to fix spark-shell startup errors

Scala version incompatibility
This happens when project dependency libraries have been copied into /usr/cwgis/app/spark/jars/lib/ and their Scala jars conflict with the Scala version bundled with Spark. Deleting the jar files whose names start with scala lets spark-shell start again:

[root@node111 ~]# runCmd.sh "rm /usr/cwgis/app/spark/jars/lib/scala*.jar" all
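runCmd.sh is a site-specific helper that runs a command on every node of the cluster. A minimal sketch of such a script, assuming passwordless ssh and a hypothetical node list at /usr/cwgis/app/nodes.txt:

#!/bin/bash
# runCmd.sh "CMD" all -- run CMD on each host (nodes.txt and its path are assumptions)
CMD="$1"
while read -r host; do
    ssh "$host" "$CMD"    # requires passwordless ssh to each node
done < /usr/cwgis/app/nodes.txt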

Error: Could not find or load main class org.apache.spark.deploy.yarn.ExecutorLauncher

SparkException: Yarn application has already ended! It might have been killed or unable to launch application master

These two errors appear when submitting to YARN and the Spark jars are not available to the YARN containers, so the application master cannot start. The fix depends on the Spark version.
Spark 1.0.x
Point the SPARK_JAR environment variable at the assembly jar before launching:

[hadoop@localhost spark-1.0.1-bin-hadoop2]$ export SPARK_JAR=lib/spark-assembly-1.0.1-hadoop2.2.0.jar
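With SPARK_JAR exported, the shell can then be started against YARN; yarn-client was the master string used by the Spark 1.x shell:

[hadoop@localhost spark-1.0.1-bin-hadoop2]$ bin/spark-shell --master yarn-client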

Other Spark versions (2.0 and later)

How to build the archive jar:
Generate a spark-libs.jar that packages every jar under /spark/jars, including jars in subdirectories:

jar cv0f spark-libs.jar -C /usr/cwgis/app/spark/jars/ .
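The configuration below points at HDFS, so the archive has to be uploaded there first. A sketch, assuming the /spark directory on the mycluster namespace is the intended target:

hdfs dfs -mkdir -p hdfs://mycluster:8020/spark
hdfs dfs -put -f spark-libs.jar hdfs://mycluster:8020/spark/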

In spark-defaults.conf set spark.yarn.archive=hdfs://mycluster:8020/spark/spark-libs.jar
Alternatively spark.yarn.jars can be used, but it expects a list of individual jar paths (e.g. hdfs://mycluster:8020/spark/jars/*), while spark.yarn.archive is the property intended for a single packaged archive like the one built above.
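To verify the setting, launch the shell against YARN again:

spark-shell --master yarn

If the archive is picked up, the client log no longer prints the "Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME" warning, and org.apache.spark.deploy.yarn.ExecutorLauncher resolves inside the containers.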
