Zeppelin 0.8 with Spark 2.2: upgrade issues

I was previously running Zeppelin 0.7.2 with Spark 1.6. After upgrading Zeppelin to 0.8, I found that neither the old 1.6 configuration nor a default Spark 2.2 configuration works out of the box. You will likely run into three problems.

  1. Incompatible Jackson version: 2.8.11-1
    This is a Jackson version conflict between Zeppelin's bundled jars and Spark's. Replace Zeppelin's copies with Spark's:
    rm zeppelin/lib/jackson-*
    cp $SPARK_HOME/jars/jackson-* zeppelin/lib/
    That resolves it.

  2. org.apache.spark.network.client.TransportResponseHandler.channelInactive
    This one is caused by mismatched Netty jar versions. The fix is the same:
    rm zeppelin/lib/netty*
    cp $SPARK_HOME/jars/netty* zeppelin/lib/
    Done.
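Both fixes follow the same pattern: drop Zeppelin's bundled copy of a jar family and substitute Spark's. A minimal helper sketch that does this for both Jackson and Netty (the directory arguments are examples; point them at your own Zeppelin lib and Spark jars directories):

```shell
#!/bin/sh
# Sketch: replace Zeppelin's bundled jars with Spark's copies so both
# sides load the same versions. Paths are passed in rather than hard-coded.
sync_jars() {
  zeppelin_lib="$1"   # e.g. /data/zeppelin-0.8.0-bin-all/lib
  spark_jars="$2"     # e.g. $SPARK_HOME/jars
  for prefix in jackson netty; do
    rm -f "$zeppelin_lib"/${prefix}*              # drop Zeppelin's bundled copies
    cp "$spark_jars"/${prefix}* "$zeppelin_lib"/  # use Spark's versions instead
  done
}
```

Back up zeppelin/lib first if you want an easy way to roll back.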

  3. Spark version not recognized
    In the Zeppelin interpreter settings, set SPARK_MAJOR_VERSION to 2.
    The other properties below are for reference; tune them to your own needs.
    Properties
    name                                        value
    SPARK_HOME                                  /data/sysdir/spark2.2
    SPARK_MAJOR_VERSION                         2
    args
    master                                      yarn-client
    spark.app.name                              Zeppelin
    spark.cores.max                             3
    spark.driver.extraJavaOptions               -XX:PermSize=2g -XX:MaxPermSize=2g
    spark.driver.maxResultSize                  8g
    spark.dynamicAllocation.enabled             false
    spark.executor.instances                    150
    spark.executor.memory                       8g
    spark.kryoserializer.buffer.max             2000m
    spark.yarn.executor.memoryOverhead          4086
    spark.yarn.queue                            xxx.xxx (YARN queue name)
    yarn.nodemanager.pmem-check-enabled         false
    yarn.nodemanager.vmem-check-enabled         false
    zeppelin.R.cmd                              R
    zeppelin.R.image.width                      100%
    zeppelin.R.knitr                            true
    zeppelin.R.render.options                   out.format = 'html', comment = NA, echo = FALSE, results = 'asis', message = F, warning = F, fig.retina = 2
    zeppelin.dep.additionalRemoteRepository     spark-packages,http://dl.bintray.com/spark-packages/maven,false;
    zeppelin.dep.localrepo                      local-repo
    zeppelin.interpreter.localRepo              /data/zeppelin-0.8.0-bin-all/local-repo/3CM5P9NU9
    zeppelin.interpreter.output.limit           10240000
    zeppelin.pyspark.python                     java
    zeppelin.pyspark.useIPython                 true
    zeppelin.spark.concurrentSQL                false
    zeppelin.spark.enableSupportedVersionCheck  true
    zeppelin.spark.importImplicit               true
    zeppelin.spark.maxResult                    10000
    zeppelin.spark.printREPLOutput              true
    zeppelin.spark.sql.interpolation            true
    zeppelin.spark.sql.stacktrace               true
    zeppelin.spark.uiWebUrl
    zeppelin.spark.useHiveContext               true
    zeppelin.spark.useNew                       true
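If you prefer to set this outside the interpreter UI, the same two variables can be exported in conf/zeppelin-env.sh, since interpreter processes inherit Zeppelin's environment. A minimal sketch, reusing the paths from the table above (adjust them to your own installation; whether your distribution honors SPARK_MAJOR_VERSION from this file may vary, so verify against your setup):

```shell
# In conf/zeppelin-env.sh (example paths from this setup, not fixed values)
export SPARK_HOME=/data/sysdir/spark2.2
export SPARK_MAJOR_VERSION=2   # tell Zeppelin to treat Spark as 2.x
```

Restart the Zeppelin daemon after editing the file so the interpreter picks up the new environment.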
