Spark job submission fails with "bad substitution"


Problem Description

When submitting a Spark job to the cluster with spark-submit, the following error is reported:

Exception message: /hadoop/yarn/local/usercache/qxadmin/appcache/application_1631068541144_0002/container_e10_1631068541144_0002_01_000001/launch_container.sh: line 22: $PWD:$PWD/__spark_conf__:$PWD/__spark_libs__/*:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure:$PWD/__spark_conf__/__hadoop_conf__: bad substitution

(The original post includes a screenshot of this error here.)

Root Cause

Hadoop's mapred-site.xml references the hdp.version variable, but that variable is defined nowhere in the configuration. The literal ${hdp.version} therefore survives into the generated launch_container.sh, where bash rejects it with "bad substitution".
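For reference, the unresolved references typically live in classpath-style properties; a representative fragment of an HDP default mapred-site.xml (the property value below is illustrative, abridged from the error message above) looks like:

```xml
<!-- Representative HDP default: the literal ${hdp.version} below is what
     the shell cannot expand when the variable is undefined. -->
<property>
    <name>mapreduce.application.classpath</name>
    <value>$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar</value>
</property>
```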

Solution

1. In the cluster management page, manually add the hdp.version variable to both the YARN and MapReduce2 components. The value has to be looked up on the cluster itself, as shown below.
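A sketch of how the version string can be looked up on a cluster node (assuming a standard HDP layout, where /usr/hdp holds one directory per installed version and the hdp-select tool is bundled with the distribution):

```
# On any cluster node, each installed HDP version has a directory here
ls /usr/hdp/
# e.g. 2.5.0.0-1245  current

# hdp-select can also list the installed versions
hdp-select versions
```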

(The original post includes screenshots of the cluster management page here.)

2. On the client machine, add the following under /usr/hdp/2.5.0.0-1245/spark2/conf/spark-defaults.conf:

spark.driver.extraJavaOptions -Dhdp.version=2.5.0.0-1245
spark.yarn.am.extraJavaOptions -Dhdp.version=2.5.0.0-1245
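If editing spark-defaults.conf is not an option, the same two properties can also be supplied per job via --conf on the spark-submit command line (the application jar and its arguments are elided here):

```
spark-submit \
  --conf "spark.driver.extraJavaOptions=-Dhdp.version=2.5.0.0-1245" \
  --conf "spark.yarn.am.extraJavaOptions=-Dhdp.version=2.5.0.0-1245" \
  ...
```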

3. On the client machine, locate the mapred-site.xml and yarn-site.xml files of the YARN and MapReduce2 components, and add the following to each:

    <property>
        <name>hdp.version</name>
        <value>2.7.3.2.5.0.0-1245</value>
    </property>
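After editing, the site files can be sanity-checked for the new property and for any remaining literal ${hdp.version} placeholders. A minimal local sketch of the check (the sample file below is fabricated for illustration; on a real cluster, point grep at the actual mapred-site.xml and yarn-site.xml):

```shell
# Write a sample file standing in for a site XML (illustration only)
cat > /tmp/sample-site.xml <<'EOF'
<property>
    <name>hdp.version</name>
    <value>2.7.3.2.5.0.0-1245</value>
</property>
EOF

# 1) Confirm the hdp.version property is present (prints the match count)
grep -c '<name>hdp.version</name>' /tmp/sample-site.xml

# 2) Confirm no literal ${hdp.version} placeholders remain
#    (-F matches the string literally; grep -c prints 0 and exits
#    non-zero when nothing matches)
grep -Fc '${hdp.version}' /tmp/sample-site.xml || echo 'no unresolved placeholders'
```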

4. On all three servers in the cluster, add the following under /usr/hdp/2.5.0.0-1245/spark2/conf/spark-defaults.conf:

spark.driver.extraJavaOptions -Dhdp.version=2.5.0.0-1245
spark.yarn.am.extraJavaOptions -Dhdp.version=2.5.0.0-1245



Different HDP clusters may behave differently; on my cluster, skipping any one of the steps above brought the error back.
