Error when starting the Spark Thrift Server via Ambari: bad substitution

System information:
CentOS 6.8 (Final)
Ambari version: 2.2.1
HDP version: 2.7.1.2.3.2.0-2950 (Hadoop 2.7.1 on HDP build 2.3.2.0-2950)
Cause:
Spark on YARN does not receive the hdp.version property, so the ${hdp.version} placeholders in the classpath are never expanded and the container launch script aborts with "bad substitution".
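(For background: a workaround often mentioned for this class of error, and not the fix applied in this post, is to pass the version to the Spark driver and ApplicationMaster explicitly in spark-defaults.conf. The values below are illustrative assumptions; use your own cluster's build number.)

# spark-defaults.conf (illustrative; not the approach used below)
spark.driver.extraJavaOptions    -Dhdp.version=2.3.2.0-2950
spark.yarn.am.extraJavaOptions   -Dhdp.version=2.3.2.0-2950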
Solution:
Change the MapReduce2 configuration in the Ambari web UI:
MapReduce2 → Configs → Advanced → Advanced mapred-site
Edit the mapreduce.application.classpath parameter as follows (a command-line alternative is sketched after the replacement value below).

The original value is:

$PWD:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure

Replace it with the following, with the ${hdp.version} placeholders hard-coded to the cluster's build number (be careful not to copy any extra whitespace):

$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/2.3.2.0-2950/hadoop/lib/hadoop-lzo-0.6.0.2.3.2.0-2950.jar:/etc/hadoop/conf/secure
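The hard-coded build number must match your cluster. A quick way to read it, plus an optional way to apply the same change from the command line with Ambari's bundled configs.sh script, is sketched below; the host name, credentials, and cluster name are placeholders, not values from this post.

# Read the concrete HDP build number (the directory name under /usr/hdp)
ls /usr/hdp/
# -> 2.3.2.0-2950  current

# Optionally set the property without the UI
# (admin/admin and "mycluster" are placeholders)
/var/lib/ambari-server/resources/scripts/configs.sh -u admin -p admin \
  set ambari-server-host mycluster mapred-site \
  mapreduce.application.classpath "<replacement value shown above>"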

Error log:

INFO Client: 
     client token: N/A
     diagnostics: N/A
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1491793387707
     final status: UNDEFINED
     tracking URL: http://hdp-namenode:8088/proxy/application_1491793125407_0001/
     user: hive
17/04/10 11:03:09 INFO Client: Application report for application_1491793125407_0001 (state: ACCEPTED)
17/04/10 11:03:10 INFO Client: Application report for application_1491793125407_0001 (state: ACCEPTED)
17/04/10 11:03:11 INFO Client: Application report for application_1491793125407_0001 (state: ACCEPTED)
17/04/10 11:03:12 INFO Client: Application report for application_1491793125407_0001 (state: ACCEPTED)
17/04/10 11:03:13 INFO Client: Application report for application_1491793125407_0001 (state: ACCEPTED)
17/04/10 11:03:14 INFO Client: Application report for application_1491793125407_0001 (state: ACCEPTED)
17/04/10 11:03:15 INFO Client: Application report for application_1491793125407_0001 (state: ACCEPTED)
17/04/10 11:03:17 INFO Client: Application report for application_1491793125407_0001 (state: ACCEPTED)
17/04/10 11:03:18 INFO Client: Application report for application_1491793125407_0001 (state: ACCEPTED)
17/04/10 11:03:19 INFO Client: Application report for application_1491793125407_0001 (state: ACCEPTED)
17/04/10 11:03:20 INFO Client: Application report for application_1491793125407_0001 (state: ACCEPTED)
17/04/10 11:03:21 INFO Client: Application report for application_1491793125407_0001 (state: ACCEPTED)
17/04/10 11:03:22 INFO Client: Application report for application_1491793125407_0001 (state: ACCEPTED)
17/04/10 11:03:23 INFO Client: Application report for application_1491793125407_0001 (state: ACCEPTED)
17/04/10 11:03:24 INFO Client: Application report for application_1491793125407_0001 (state: ACCEPTED)
17/04/10 11:03:25 INFO Client: Application report for application_1491793125407_0001 (state: FAILED)
17/04/10 11:03:25 INFO Client: 
     client token: N/A
     diagnostics: Application application_1491793125407_0001 failed 2 times due to AM Container for appattempt_1491793125407_0001_000002 exited with  exitCode: 1
For more detailed output, check application tracking page:http://hdp-namenode:8088/cluster/app/application_1491793125407_0001Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_e50_1491793125407_0001_02_000001
Exit code: 1
Exception message: /opt/hadoop/yarn/local/usercache/hive/appcache/application_1491793125407_0001/container_e50_1491793125407_0001_02_000001/launch_container.sh: line 24: $PWD:$PWD/__hadoop_conf__:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure: bad substitution

Stack trace: ExitCodeException exitCode=1: /opt/hadoop/yarn/local/usercache/hive/appcache/application_1491793125407_0001/container_e50_1491793125407_0001_02_000001/launch_container.sh: line 24: $PWD:$PWD/__hadoop_conf__:$PWD/__spark__.jar:$HADOOP_CONF_DIR:/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/*:$PWD/mr-framework/hadoop/share/hadoop/mapreduce/lib/*:$PWD/mr-framework/hadoop/share/hadoop/common/*:$PWD/mr-framework/hadoop/share/hadoop/common/lib/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/*:$PWD/mr-framework/hadoop/share/hadoop/yarn/lib/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/*:$PWD/mr-framework/hadoop/share/hadoop/hdfs/lib/*:$PWD/mr-framework/hadoop/share/hadoop/tools/lib/*:/usr/hdp/${hdp.version}/hadoop/lib/hadoop-lzo-0.6.0.${hdp.version}.jar:/etc/hadoop/conf/secure: bad substitution

    at org.apache.hadoop.util.Shell.runCommand(Shell.java:576)
    at org.apache.hadoop.util.Shell.run(Shell.java:487)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:753)
    at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
    at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)


Container exited with a non-zero exit code 1
Failing this attempt. Failing the application.
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1491793387707
     final status: FAILED
     tracking URL: http://hdp-namenode:8088/cluster/app/application_1491793125407_0001
     user: hive
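
After saving the change, restart the components that Ambari flags as needing a restart, and then the Spark Thrift Server itself (this is the standard Ambari workflow, not something specific to this post). To confirm the error is gone, the aggregated YARN logs of the new attempt can be grepped; the application ID below is the failed one from the log above, so substitute your new one:

yarn logs -applicationId application_1491793125407_0001 | grep -i "bad substitution"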
