Error when running Oozie 4.3.0: …

fs://master2host:9000/user/master2/share/lib/spark/py4j-0.9.jar,hdfs://master2host:9000/user/master2/share/lib/spark/avro-ipc-1.7.7-tests.jar,hdfs://master2host:9000/user/master2/share/lib/spark/quasiquotes_2.10-2.0.0-M8.jar,hdfs://master2host:9000/user/master2/share/lib/spark/scalap-2.10.0.jar,hdfs://master2host:9000/user/master2/share/lib/spark/spark-streaming-flume_2.10-1.6.1.jar,hdfs://master2host:9000/user/master2/share/lib/spark/scala-library-2.10.5.jar,hdfs://master2host:9000/user/master2/share/lib/spark/jaxb-api-2.2.2.jar,hdfs://master2host:9000/user/master2/share/lib/spark/kafka-clients-0.8.2.1.jar,hdfs://master2host:9000/user/master2/share/lib/spark/kryo-2.22.jar,hdfs://master2host:9000/user/master2/share/lib/spark/slf4j-log4j12-1.6.6.jar,hdfs://master2host:9000/user/master2/share/lib/spark/jodd-core-3.5.2.jar,hdfs://master2host:9000/user/master2/share/lib/spark/commons-codec-1.4.jar,hdfs://master2host:9000/user/master2/share/lib/spark/jackson-databind-2.4.4.jar,hdfs://master2host:9000/user/master2/share/lib/spark/jetty-6.1.14.jar,hdfs://master2host:9000/user/master2/share/lib/spark/curator-recipes-2.5.0.jar,hdfs://master2host:9000/user/master2/share/lib/spark/log4j-1.2.17.jar,hdfs://master2host:9000/user/master2/share/lib/spark/spark-graphx_2.10-1.6.1.jar,hdfs://master2host:9000/user/master2/share/lib/spark/avro-1.7.7.jar,hdfs://master2host:9000/user/master2/share/lib/spark/parquet-column-1.7.0.jar,hdfs://master2host:9000/user/master2/share/lib/spark/spark-streaming_2.10-1.6.1.jar,hdfs://master2host:9000/user/master2/share/lib/spark/spark-unsafe_2.10-1.6.1.jar,hdfs://master2host:9000/user/master2/share/lib/spark/spark-launcher_2.10-1.6.1.jar,hdfs://master2host:9000/user/master2/share/lib/spark/commons-logging-1.1.jar,hdfs://master2host:9000/user/master2/share/lib/spark/jetty-util-6.1.26.jar,hdfs://master2host:9000/user/master2/share/lib/spark/tachyon-underfs-hdfs-0.8.2.jar,hdfs://master2host:9000/user/master2/share/lib/spark/parquet-hadoop-1.7.0.jar,hdfs:
//master2host:9000/user/master2/share/lib/spark/avro-ipc-1.7.7.jar,hdfs://master2host:9000/user/master2/share/lib/oozie/json-simple-1.1.jar,hdfs://master2host:9000/user/master2/share/lib/oozie/oozie-hadoop-utils-hadoop-2-4.3.0.jar,hdfs://master2host:9000/user/master2/share/lib/oozie/oozie-sharelib-oozie-4.3.0.jar
  pyFiles                 file:/home/master2/hadoop_tmp/nm-local-dir/usercache/master2/appcache/application_1487340758413_0003/container_1487340758413_0003_01_000001/pyspark.zip,file:/home/master2/hadoop_tmp/nm-local-dir/usercache/master2/appcache/application_1487340758413_0003/container_1487340758413_0003_01_000001/py4j.zip
  archives                null
  mainClass               null
  primaryResource         hdfs://master2host:9000/user/master2/examples/apps/pythonApp/lib/spark1.py
  name                    Spark-python
  childArgs               []
  jars                    null
  packages                null
  packagesExclusions      null
  repositories            null
  verbose                 true

Spark properties used, including those specified through
 --conf and those from the properties file null:
  spark.yarn.security.tokens.hive.enabled -> false
  spark.yarn.jar -> null
  spark.yarn.tags -> oozie-716cb74f8eb05f10a1382d221c0f2c90
  spark.executor.extraJavaOptions -> -Dlog4j.configuration=spark-log4j.properties
  spark.yarn.security.tokens.hbase.enabled -> false
  spark.driver.extraJavaOptions -> -Dlog4j.configuration=spark-log4j.properties
  spark.executor.extraClassPath -> $PWD/*
  spark.driver.extraClassPath -> $PWD/*

    
Error: Could not load YARN classes. This copy of Spark may not have been compiled with YARN support.
Run with --help for usage help or --verbose for debug output
Intercepting System.exit(1)
Failing Oozie Launcher, Main class [org.apache.oozie.action.hadoop.SparkMain], exit code [1]
log4j:WARN No appenders could be found for logger (org.apache.hadoop.mapreduce.v2.app.MRAppMaster).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.

  My Spark build is for Scala 2.11.
  When you hit this problem, you need to refresh the Oozie sharelib in HDFS, especially the contents of the spark folder. Until then, the spark folder held the libraries that Oozie had compiled by default, not the 2.11 build actually installed, so the Spark-on-YARN jars could not be found.
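Concretely, refreshing the sharelib means replacing the spark folder in HDFS with the jars from the locally installed Spark and then telling the Oozie server to reload. The following is only a sketch: the local Spark path (/usr/local/spark), the HDFS sharelib location (taken from the logs above), and the Oozie server URL are assumptions that must be adapted to your cluster.

```shell
# Sketch: sync the Oozie spark sharelib with the locally installed Spark.
# All paths and the Oozie URL below are assumptions; adjust to your cluster.

# 1. Remove the stale spark sharelib that Oozie built and uploaded by default.
hdfs dfs -rm -r -skipTrash /user/master2/share/lib/spark

# 2. Upload the jars from the local Spark installation (the build that was
#    compiled with YARN support). Spark 2.x keeps its jars under jars/;
#    for Spark 1.x use lib/ instead.
hdfs dfs -mkdir -p /user/master2/share/lib/spark
hdfs dfs -put /usr/local/spark/jars/*.jar /user/master2/share/lib/spark/

# 3. Ask the running Oozie server to reload the sharelib without a restart.
oozie admin -oozie http://master2host:11000/oozie -sharelibupdate

# 4. Verify which jars Oozie now sees for the spark action.
oozie admin -oozie http://master2host:11000/oozie -shareliblist spark
```

Note that newer Oozie versions keep the sharelib under a timestamped directory (share/lib/lib_&lt;timestamp&gt;/…); the flat layout above matches the paths visible in this cluster's logs.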
Yet Oozie presumptuously reports that Spark "may not have been compiled with YARN support". The distribution I downloaded is Spark built for Hadoop; of course it supports YARN!
  The real cause is that Oozie's own bundled libraries lack that support. I really can't follow Oozie's logic here: if you need someone else's libraries, just link to them and use them directly. Why compile your own copy?! And if you absolutely must upload them to HDFS, at least copy the installed ones over wholesale with a shell command.
  All of this setup should be automated. A sound principle is: use whatever is already installed on the system, and don't presumptuously substitute some other version.

Meanwhile, the following properties also have to be added to oozie-site.xml for Spark jobs to run smoothly:
    <property>
        <name>oozie.service.SparkConfigurationService.spark.configurations</name>
        <value>*=/usr/local/spark/conf</value>
    </property>

    <property>
        <name>oozie.service.WorkflowAppService.system.libpath</name>
        <value>/user/oozie/share/lib</value>
    </property>

    <property>
        <name>oozie.use.system.libpath</name>
        <value>true</value>
        <description>
            Default value of oozie.use.system.libpath. If the user hasn't specified
            oozie.use.system.libpath in job.properties and this value is true,
            Oozie will include the sharelib jars for the workflow.
        </description>
    </property>
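On the job side, the workflow must also opt in to the sharelib. A minimal job.properties plus the submit command might look like the sketch below; the host names, ports, and HDFS paths are assumptions taken from the logs above and need to match your cluster.

```shell
# Sketch: submit the pySpark workflow with the system libpath enabled.
# Hosts, ports, and HDFS paths are assumptions; adapt them to your cluster.

cat > job.properties <<'EOF'
nameNode=hdfs://master2host:9000
jobTracker=master2host:8032
# Without this flag Oozie will not put the sharelib jars on the classpath,
# and the SparkMain launcher fails exactly as in the log above.
oozie.use.system.libpath=true
oozie.wf.application.path=${nameNode}/user/master2/examples/apps/pythonApp
EOF

oozie job -oozie http://master2host:11000/oozie -config job.properties -run
```

Setting oozie.use.system.libpath per job overrides the server-side default, so it is worth keeping in job.properties even when the server default is already true.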
