Big Data IMF Legendary Action: Spark history-server configuration, a powerful tool for operations staff

Configuring the Spark History Server
 
1. In Spark's conf directory (/usr/local/spark-1.6.0-bin-hadoop2.6/conf), rename spark-defaults.conf.template to spark-defaults.conf:
 mv spark-defaults.conf.template spark-defaults.conf
 
 
2. Edit spark-defaults.conf and add the following settings:
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs://Master:9000/historyserverforSpark
spark.history.ui.port            18080
spark.history.fs.logDirectory    hdfs://Master:9000/historyserverforSpark
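Note that the HDFS directory referenced by spark.eventLog.dir must already exist before applications start logging events, or submissions will fail at startup. A minimal sketch, assuming HDFS is running at Master:9000 as configured above:

```shell
# Create the event log directory on HDFS (-p is a no-op if it already exists)
hdfs dfs -mkdir -p hdfs://Master:9000/historyserverforSpark
```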

3. Start the history server:
[root@master sbin]# ./start-history-server.sh

4. Verify with jps that the HistoryServer process is running:
[root@master sbin]# jps
2691 DataNode
3459 Master
5444 HistoryServer
2836 SecondaryNameNode
2600 NameNode
3512 Worker
5641 Jps
[root@master sbin]#
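With event logging enabled, any application submitted from now on writes its events to the configured HDFS directory and shows up in the history server UI once it finishes. As an illustration (the example jar path and the master URL spark://Master:7077 are assumptions based on the install layout above; adjust them for your cluster):

```shell
# Run the bundled SparkPi example; its event log lands in
# hdfs://Master:9000/historyserverforSpark when the job completes
/usr/local/spark-1.6.0-bin-hadoop2.6/bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://Master:7077 \
  /usr/local/spark-1.6.0-bin-hadoop2.6/lib/spark-examples-1.6.0-hadoop2.6.0.jar 100
```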

5. Open http://192.168.2.100:18080/ in a web browser. The page shows:

1.6.0 History Server

Event log directory: hdfs://Master:9000/historyserverforSpark

 

6. Troubleshooting:
Initially, spark.history.fs.logDirectory    hdfs://Master:9000/historyserverforSpark had not been configured. After startup, nothing was listening on port 18080. Checking the history server startup log under Spark's logs directory:
[root@master sbin]# cat /usr/local/spark-1.6.0-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.history.HistoryServer-1-master.out
revealed the following error:
 Caused by: java.lang.IllegalArgumentException: Log directory specified does not exist: file:/tmp/spark-events. Did you configure the correct one through spark.fs.history.logDirectory?

Adding spark.history.fs.logDirectory    hdfs://Master:9000/historyserverforSpark fixed it.
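A quick way to confirm whether the server actually came up after a restart is to check that something is listening on port 18080. The netstat flags below are a common GNU/Linux invocation; your distribution may differ:

```shell
# Should show a java process (the HistoryServer) bound to 18080
netstat -tlnp | grep 18080
# Or fetch the UI directly:
curl -s http://192.168.2.100:18080/ | head
```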
 
