Spark

Because no memory sizes were specified for the nodes when the Spark job was submitted, the default configuration was used, which led to the following exception:

    (106 + 45) / 200]17/09/15 10:04:46 ERROR client.TransportClient: Failed to send RPC 7807032932563004737 to dn129.avcdata.com/192.168.20.129:40006: java.nio.channels.ClosedChannelException
    java.nio.channels.ClosedChannelException
    17/09/15 10:04:46 ERROR cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Sending RequestExecutors(94,0,Map(),Set()) to AM was unsuccessful
    java.io.IOException: Failed to send RPC 7807032932563004737 to dn129.avcdata.com/192.168.20.129:40006: java.nio.channels.ClosedChannelException
    at org.apache.spark.network.client.TransportClient$3.operationComplete(TransportClient.java:239)
    at org.apache.spark.network.client.TransportClient$3.operationComplete(TransportClient.java:226)
    at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:680)
    at io.netty.util.concurrent.DefaultPromise.notifyListeners(DefaultPromise.java:567)
    at io.netty.util.concurrent.DefaultPromise.tryFailure(DefaultPromise.java:424)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.safeSetFailure(AbstractChannel.java:801)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.write(AbstractChannel.java:699)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.write(DefaultChannelPipeline.java:1122)
    at io.netty.channel.AbstractChannelHandlerContext.invokeWrite(AbstractChannelHandlerContext.java:633)
    at io.netty.channel.AbstractChannelHandlerContext.access$1900(AbstractChannelHandlerContext.java:32)
    at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.write(AbstractChannelHandlerContext.java:908)
    at io.netty.channel.AbstractChannelHandlerContext$WriteAndFlushTask.write(AbstractChannelHandlerContext.java:960)
    at io.netty.channel.AbstractChannelHandlerContext$AbstractWriteTask.run(AbstractChannelHandlerContext.java:893)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at java.lang.Thread.run(Thread.java:745)
    Caused by: java.nio.channels.ClosedChannelException

This ClosedChannelException usually means the executor container was killed by YARN for exceeding its memory limits, so the RPC connection to it was torn down. The fix is to configure more memory for the Spark job explicitly instead of relying on the defaults.
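A minimal sketch of setting the memory explicitly at submit time; the sizes below (4g driver, 6g executors, 1024 MB overhead), the executor/core counts, and the name your-app.jar are placeholder assumptions, not values from the original post, and should be tuned to your cluster:

    # Request explicit driver/executor memory instead of relying on defaults.
    # spark.yarn.executor.memoryOverhead raises the off-heap headroom YARN
    # allows before it kills the container (Spark 1.x/2.x property name;
    # later Spark versions rename it to spark.executor.memoryOverhead).
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --driver-memory 4g \
      --executor-memory 6g \
      --executor-cores 2 \
      --num-executors 20 \
      --conf spark.yarn.executor.memoryOverhead=1024 \
      your-app.jar

If executors are being killed for exceeding their memory limits, raising spark.yarn.executor.memoryOverhead alone is often enough, since its default of max(384 MB, 10% of executor memory) can be too small for jobs with heavy off-heap usage.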
