Pitfalls encountered with Zeppelin + Spark

### 1. Error: java.lang.NoSuchFieldError: HIVE_STATS_JDBC_TIMEOUT

After a full day of troubleshooting, the cause turned out to be a broken Spark client installation. Reinstalling the Spark client fixed the problem.
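Before pointing Zeppelin at the client again, it is worth sanity-checking the reinstalled Spark from the command line. A minimal sketch, assuming the Spark layout used later in this post (the `SPARK_HOME` path is an example, not something the original notes specify):

```shell
# Point at the reinstalled Spark client (example path -- adjust to your install).
export SPARK_HOME=/home/hadoop/spark-2.0.0-bin-hadoop2.6

# Should print the Spark and Scala versions cleanly, with no class-loading errors.
"$SPARK_HOME/bin/spark-submit" --version
```

If `spark-submit --version` itself throws `NoSuchFieldError` or similar linkage errors, the client jars are still inconsistent and Zeppelin will fail the same way.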
### 2. Error

org.apache.spark.SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
  at org.apache.spark.SparkConf$$anonfun$validateSettings$7$$anonfun$apply$8.apply(SparkConf.scala:543)
  at org.apache.spark.SparkConf$$anonfun$validateSettings$7$$anonfun$apply$8.apply(SparkConf.scala:541)
  at scala.collection.immutable.List.foreach(List.scala:381)
  at org.apache.spark.SparkConf$$anonfun$validateSettings$7.apply(SparkConf.scala:541)
  at org.apache.spark.SparkConf$$anonfun$validateSettings$7.apply(SparkConf.scala:529)
  at scala.Option.foreach(Option.scala:257)
  at org.apache.spark.SparkConf.validateSettings(SparkConf.scala:529)
  at org.apache.spark.SparkContext.<init>(SparkContext.scala:368)
  at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2256)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
  at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:606)
  at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:38)
  at org.apache.zeppelin.spark.Utils.invokeMethod(Utils.java:33)
  at org.apache.zeppelin.spark.SparkInterpreter.createSparkSession(SparkInterpreter.java:368)
  at org.apache.zeppelin.spark.SparkInterpreter.getSparkSession(SparkInterpreter.java:233)
  at org.apache.zeppelin.spark.SparkInterpreter.open(SparkInterpreter.java:841)
  at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
  at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:491)
  at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
  at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
  at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
  at java.util.concurrent.FutureTask.run(FutureTask.java:262)
  at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
  at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
  at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
  at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
  at java.lang.Thread.run(Thread.java:745)
 INFO [2017-11-27 10:36:49,880] ({pool-2-thread-4} Logging.scala[logInfo]:54) - Successfully stopped SparkContext

The fix is to edit bin/interpreter.sh and remove the `--driver-class-path "${ZEPPELIN_CLASSPATH_OVERRIDES}:${CLASSPATH}"` argument, since Zeppelin also sets SPARK_CLASSPATH and Spark refuses to start when both are present.
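Roughly, the change in bin/interpreter.sh looks like the following (a sketch only; the exact surrounding command line differs between Zeppelin versions, so locate the `--driver-class-path` fragment in your copy of the script rather than pasting this verbatim):

```shell
# bin/interpreter.sh (sketch of the relevant fragment)

# Before -- Zeppelin passes both a driver classpath flag and SPARK_CLASSPATH,
# which Spark rejects with "Found both spark.driver.extraClassPath and SPARK_CLASSPATH":
#   ${SPARK_SUBMIT} ... --driver-class-path "${ZEPPELIN_CLASSPATH_OVERRIDES}:${CLASSPATH}" ...

# After -- drop the --driver-class-path argument and let SPARK_CLASSPATH alone
# carry the interpreter classpath:
#   ${SPARK_SUBMIT} ... 
```

The underlying rule is Spark's own: `spark.driver.extraClassPath` (which `--driver-class-path` sets) supersedes the deprecated SPARK_CLASSPATH environment variable, and SparkConf.validateSettings aborts if it sees both.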
### 3. Error

  at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:223)
  at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:121)
  at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
  at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
  at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
  at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
  at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
  at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103) 
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
  at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
  at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
  at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
  at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
  at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
  at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
  at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
  at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
  at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
  at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
  at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
  at java.lang.Thread.run(Thread.java:745)

Fix: edit conf/zeppelin-env.sh and add:
export SPARK_SUBMIT_OPTIONS="--jars /home/hadoop/spark-2.0.0-bin-hadoop2.6/jars/mysql-connector-java-5.1.11-bin.jar"
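As a self-contained config fragment, the zeppelin-env.sh addition looks like this (the jar path is the one from this post; substitute wherever your MySQL connector jar actually lives):

```shell
# conf/zeppelin-env.sh
# SPARK_SUBMIT_OPTIONS is appended to the spark-submit command that Zeppelin
# uses to launch its Spark interpreter, so --jars here ships the MySQL JDBC
# driver to both the driver and the executors.
export SPARK_SUBMIT_OPTIONS="--jars /home/hadoop/spark-2.0.0-bin-hadoop2.6/jars/mysql-connector-java-5.1.11-bin.jar"
```

Restart the Zeppelin Spark interpreter after the change so the new spark-submit options take effect.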
