Troubleshooting spark-submit errors on YARN

Spark cluster submission modes

Spark applications are commonly submitted in one of three modes: local, standalone cluster, and YARN.

1. Configuration priority at submit time

Settings made with set() inside the application code take priority over spark-submit options.
Reference: http://blog.csdn.net/xiaolang85/article/details/51364259
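The precedence rule above can be illustrated with a small sketch. This is not Spark source code, just a hypothetical model of how the effective configuration is resolved: application-level `SparkConf.set()` overrides spark-submit flags, which override the defaults file.

```python
# Sketch only (not Spark internals): later sources override earlier ones.
def resolve_conf(defaults, submit_flags, app_code):
    conf = dict(defaults)      # lowest priority: spark-defaults.conf
    conf.update(submit_flags)  # next: --master / --conf on spark-submit
    conf.update(app_code)      # highest: SparkConf.set() in the application
    return conf

effective = resolve_conf(
    {"spark.master": "local[*]"},           # spark-defaults.conf
    {"spark.master": "yarn"},               # spark-submit --master yarn
    {"spark.master": "spark://master:7077"} # setMaster() in the app
)
print(effective["spark.master"])  # the value set in application code wins
```

This is why a hard-coded setMaster() in the application silently ignores the `--master` flag you pass on the command line.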

2. Standalone cluster

(This assumes the application does not call setMaster, or sets it to the master URL.)
spark-submit --master spark://master:7077 XXX
Open master:4040 and check the Environment tab.
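A fuller standalone submission might look like the following. The class name and jar are placeholders, not from the original post:

```
# Hypothetical example: class, jar, and memory settings are placeholders
spark-submit \
  --master spark://master:7077 \
  --class com.example.WordCount \
  --executor-memory 1g \
  wordcount.jar
```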

[Screenshot 1: Spark UI Environment tab]

The spark.master entry there shows the standalone-cluster mode.

3. YARN mode

Submit with:
spark-submit --master yarn XXX
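On YARN you also choose where the driver runs via `--deploy-mode`. A sketch, with placeholder class and jar names:

```
# Hypothetical example: --deploy-mode cluster runs the driver inside YARN;
# use --deploy-mode client to keep the driver on the submitting machine
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.WordCount \
  wordcount.jar
```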


[Screenshot 2: errors during YARN submission]

Several errors can appear.

First error:
Yarn application has already ended! It might have been killed or unable to launch application master
Fix:
Check the configuration files: spark-env.sh must set SPARK_HOME and YARN_CONF_DIR.

export SPARK_HOME=$HOME/spark
export YARN_CONF_DIR=$HADOOP_HOME/etc/hadoop
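A quick way to confirm the setting took effect (assuming a standard Hadoop layout) is to check that the directory actually contains yarn-site.xml, since that is how Spark locates the ResourceManager:

```
# YARN_CONF_DIR must point at the directory holding yarn-site.xml
ls $YARN_CONF_DIR/yarn-site.xml
```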

Second error:
Failed to send RPC
Cause: the Java 8 excessive memory allocation issue. Java 8 JVMs can use more virtual memory than YARN's default limits allow, so YARN's memory checks kill the containers.
Fix: add the following to yarn-site.xml:


<property>
    <name>yarn.nodemanager.pmem-check-enabled</name>
    <value>false</value>
</property>

<property>
    <name>yarn.nodemanager.vmem-check-enabled</name>
    <value>false</value>
</property>
Reference:
http://stackoverflow.com/questions/39467761/how-to-know-what-is-the-reason-for-closedchannelexceptions-with-spark-shell-in-y
