Spark on YARN exception notes, part 2

When submitting a job to the cluster with spark-submit, an exception is thrown as soon as --num-executors is set to anything greater than 1.
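For context, the submit command had roughly the shape below. This is only a sketch: the main class, jar path, and resource settings are placeholders rather than the exact values from my job, and only the --num-executors flag matters for reproducing the problem.

    # main class, jar path, and resource sizes below are placeholders
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --class com.example.MyApp \
      --num-executors 2 \
      --executor-memory 2g \
      --executor-cores 2 \
      my-app.jar

Raising --num-executors above 1 was enough to hit the failure; the log from one such run follows.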

17/06/04 11:47:03 WARN DefaultChannelPipeline: An exception 'java.lang.NoSuchMethodError: org.apache.spark.network.client.TransportClient.getChannel()Lio/netty/channel/Channel;' [enable DEBUG level for full stacktrace] was thrown by a user handler's exceptionCaught() method while handling the following exception:
java.lang.NoSuchMethodError: org.apache.spark.network.client.TransportClient.getChannel()Lio/netty/channel/Channel;
    at org.apache.spark.rpc.netty.NettyRpcHandler.channelInactive(NettyRpcEnv.scala:621)
    at org.apache.spark.network.server.TransportRequestHandler.channelInactive(TransportRequestHandler.java:99)
    at org.apache.spark.network.server.TransportChannelHandler.channelInactive(TransportChannelHandler.java:103)
    at org.spark_project.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:251)
    at org.spark_project.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:237)
    at org.spark_project.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:230)
    at org.spark_project.io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
    at org.spark_project.io.netty.handler.timeout.IdleStateHandler.channelInactive(IdleStateHandler.java:257)
    at org.spark_project.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:251)
    at org.spark_project.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:237)
    at org.spark_project.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:230)
    at org.spark_project.io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
    at org.spark_project.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:251)
    at org.spark_project.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:237)
    at org.spark_project.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:230)
    at org.spark_project.io.netty.channel.ChannelInboundHandlerAdapter.channelInactive(ChannelInboundHandlerAdapter.java:75)
    at org.apache.spark.network.util.TransportFrameDecoder.channelInactive(TransportFrameDecoder.java:182)
    at org.spark_project.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:251)
    at org.spark_project.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:237)
    at org.spark_project.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:230)
    at org.spark_project.io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1289)
    at org.spark_project.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:251)
    at org.spark_project.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:237)
    at org.spark_project.io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:893)
    at org.spark_project.io.netty.channel.AbstractChannel$AbstractUnsafe$7.run(AbstractChannel.java:691)
    at org.spark_project.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:376)
    at org.spark_project.io.netty.util.concurrent.SingleThreadEventExecutor.confirmShutdown(SingleThreadEventExecutor.java:680)
    at org.spark_project.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:465)
    at org.spark_project.io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:140)
    at org.spark_project.io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:144)
    at java.lang.Thread.run(Thread.java:745)
17/06/04 11:47:03 INFO ShutdownHookManager: Shutdown hook called

After a good deal of googling there are plenty of suggested fixes out there, but none of them solved my problem.
I also tried the various approaches from the related JIRA, and the problem still was not resolved. Since the exception boils down to a method that cannot be found, my next thought was to pull that class out on its own and package it into the jar I submit, because I have hit this kind of situation before and fixed it exactly that way. I tried it, and the problem remains. Recording this here for now; I will update once I have asked a few experts about it……
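One thing still worth checking, though I have not confirmed it and it is only a sketch (the jar file name and the $SPARK_HOME/jars layout below are assumptions for a typical Spark 2.x install, not something taken from this cluster), is whether the TransportClient that actually ends up on the classpath declares getChannel() at all:

    # find which jars under the Spark install contain TransportClient
    for j in "$SPARK_HOME"/jars/*.jar; do
      unzip -l "$j" | grep -q 'org/apache/spark/network/client/TransportClient.class' && echo "$j"
    done

    # inspect that class and see whether a getChannel() method is declared
    # (spark-network-common_2.11-2.1.1.jar is a placeholder file name)
    javap -classpath "$SPARK_HOME"/jars/spark-network-common_2.11-2.1.1.jar \
      org.apache.spark.network.client.TransportClient | grep getChannel

If more than one jar turns up, or the copy that wins does not declare the method, that would point at a version conflict between the cluster's Spark jars and whatever my application jar pulls in; but again, that is a guess, not a confirmed cause.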
