Error: Could not find or load main class org.apache.spark.deploy.yarn.ApplicationMaster

Today I set up a Spark on YARN cluster. To check that the cluster was working, I submitted the Monte Carlo Pi example (SparkPi) that ships with Spark, but the job failed with an error.

The submit command:

bin/spark-submit \
--class org.apache.spark.examples.SparkPi \
--master yarn  --deploy-mode cluster \
--executor-memory 1G \
--total-executor-cores 2 \
hdfs://rootcloud/system/sparkJar/jars/spark-examples_2.11-2.2.1.jar \
100
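
(Side note: --total-executor-cores applies to Spark standalone and Mesos only and has no effect on YARN. If the intent is two cores in total, a roughly equivalent YARN submission would use --num-executors and --executor-cores instead; the values below are only an illustrative assumption.)

bin/spark-submit \
--class org.apache.spark.examples.SparkPi \
--master yarn  --deploy-mode cluster \
--executor-memory 1G \
--num-executors 2 \
--executor-cores 1 \
hdfs://rootcloud/system/sparkJar/jars/spark-examples_2.11-2.2.1.jar \
100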

Error printed in the shell:

Stack trace: ExitCodeException exitCode=1:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:575)
at org.apache.hadoop.util.Shell.run(Shell.java:478)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:766)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:212)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.

This error had me puzzled for quite a while. Eventually I went to the logs in YARN and found that the real problem was something else entirely; the error in the YARN logs was:

“Error: Could not find or load main class org.apache.spark.deploy.yarn.ApplicationMaster”
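
To get at this error yourself, open the failed application in the ResourceManager web UI, or, if log aggregation is enabled, pull the container logs from the command line (the application ID below is a placeholder; use the one printed by spark-submit):

yarn logs -applicationId application_1234567890123_0001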


Solution (steps below):

1. Add the following setting to spark-defaults.conf, so that YARN containers know where to fetch the Spark runtime jars (including the spark-yarn module that provides ApplicationMaster):

# list of Spark jars, stored on HDFS

spark.yarn.jars hdfs://hadoop-master01:9000/system/sparkJar/jars/*.jar

2. Upload the jars: copy the local Spark jars to the HDFS path configured above

hadoop fs -put $SPARK_HOME/jars/*  hdfs://hadoop-master01:9000/system/sparkJar/jars/
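
If the put fails because the target directory does not exist, create it first and retry, then list the directory to confirm the jars are actually there; after that, re-run the spark-submit command from the top of the post. (The namenode address and path are the ones used in this post; adjust them to your cluster.)

hadoop fs -mkdir -p hdfs://hadoop-master01:9000/system/sparkJar/jars/
hadoop fs -ls hdfs://hadoop-master01:9000/system/sparkJar/jars/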
