Viewing History from the CDH RM

Following the CDH documentation, the history address had been configured as port 18088 in spark-defaults.conf, but the RM could not jump to that address; the history logs were actually served on port 18080. On checking, the default port turned out to be 18080. After changing the setting, restarting, and rerunning a job, the link redirected correctly.
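The relevant settings live in spark-defaults.conf on the client/gateway hosts. A minimal sketch, assuming the History Server runs on a host named historyserver-host (hypothetical hostname; the event log directory matches the one shown in the listing below):

```
# spark-defaults.conf (client/gateway hosts)
spark.eventLog.enabled            true
spark.eventLog.dir                hdfs:///user/spark/applicationHistory
spark.yarn.historyServer.address  http://historyserver-host:18080
```

With spark.yarn.historyServer.address pointing at the actual port, the RM's "History" link redirects to the Spark History Server.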

However, the page at 18080 showed no application information; this turned out to be a directory permission problem.

The Spark History Server UI reads the history files as the spark user;

org.apache.hadoop.security.AccessControlException: Permission denied: user=spark, access=READ, inode="/user/spark/applicationHistory/application_1525767620603_0023":du:spark:-rwxrwx--T
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkFsPermission(DefaultAuthorizationProvider.java:279)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.check(DefaultAuthorizationProvider.java:260)
        at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:168)

while the history files are created by whichever user ran the job;
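The owner/group/other check the NameNode applies can be sketched in plain shell. One assumption, consistent with the AccessControlException above: the NameNode does not resolve the spark user into the file's spark group, so for files owned by du only the "others" bits apply:

```shell
#!/bin/sh
# Sketch of the POSIX-style permission check HDFS applies to an inode.
# can_read USER GROUPS OWNER GROUP MODE  -> prints yes/no
can_read() {
  user=$1; groups=$2; owner=$3; group=$4; mode=$5
  if [ "$user" = "$owner" ]; then
    bits=$(echo "$mode" | cut -c2-4)            # owner bits
  elif echo " $groups " | grep -q " $group "; then
    bits=$(echo "$mode" | cut -c5-7)            # group bits
  else
    bits=$(echo "$mode" | cut -c8-10)           # "others" bits
  fi
  case $bits in r*) echo yes ;; *) echo no ;; esac
}

# application_..._0023: du:spark -rwxrwx--T -> "others" have no read -> denied
can_read spark "" du spark "-rwxrwx--T"         # no
# application_..._0021: du:spark -rwxrwxrwt -> "others" can read -> allowed
can_read spark "" du spark "-rwxrwxrwt"         # yes
# a file owned by spark itself -> owner bits apply -> allowed
can_read spark "" spark spark "-rwxrwx---"      # yes
```

This matches the listing below: only the files owned by spark, or world-readable ones, show up in the History UI.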

-bash-4.1$ hadoop fs -ls /user/spark/applicationHistory/
Found 10 items
-rwxrwxrwt   3 du    spark      74485 2018-05-10 11:12 /user/spark/applicationHistory/application_1525767620603_0021
-rwxrwxrwt   3 du    spark      74498 2018-05-10 11:37 /user/spark/applicationHistory/application_1525767620603_0022
-rwxrwx--T   3 du    spark      74498 2018-05-10 11:40 /user/spark/applicationHistory/application_1525767620603_0023
-rwxrwxr--   3 root  spark      74407 2018-05-10 13:30 /user/spark/applicationHistory/application_1525767620603_0024
-rwxrwx---   3 du    spark      74485 2018-05-10 14:29 /user/spark/applicationHistory/application_1525767620603_0025
-rwxrwx---   3 spark spark      74628 2018-05-10 14:56 /user/spark/applicationHistory/application_1525767620603_0026
-rwxrwxrwt   3 root  spark      52761 2018-05-10 10:00 /user/spark/applicationHistory/local-1525917054050
-rwxrwxrwt   3 root  spark      52754 2018-05-10 10:03 /user/spark/applicationHistory/local-1525917724898
-rwxrwxrwt   3 root  spark      52782 2018-05-10 10:08 /user/spark/applicationHistory/local-1525917941187
-rwxrwxrwt   3 du    spark      72949 2018-05-10 10:37 /user/spark/applicationHistory/local-1525918259682

So the fix: create a spark user on the client and run Spark jobs as that spark user.
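A sketch of that workflow on the client/gateway host; the jar path and the one-time useradd step are hypothetical, and the point is only that submitting as spark makes the event log file owned by spark:spark, so the History Server (also running as spark) reads it via the owner bits:

```shell
# One-time setup as root on the gateway host (hypothetical):
#   useradd -g spark spark
app_jar=/opt/jobs/myapp.jar   # hypothetical application jar
submit_cmd="su - spark -c 'spark-submit --master yarn $app_jar'"
echo "$submit_cmd"
```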
