Spark SQL on YARN fails to start: ERROR client.TransportClient: Failed to send RPC

19/04/17 02:54:57 ERROR client.TransportClient: Failed to send RPC RPC 7651764253676103503 to /10.169.12.139:45996: java.nio.channels.ClosedChannelException
java.nio.channels.ClosedChannelException
        at io.netty.channel.AbstractChannel$AbstractUnsafe.write(...)(Unknown Source)
19/04/17 02:54:57 ERROR cluster.YarnSchedulerBackend$YarnSchedulerEndpoint: Sending RequestExecutors(0,0,Map(),Set()) to AM was unsuccessful
java.io.IOException: Failed to send RPC RPC 7651764253676103503 to /10.169.12.139:45996: java.nio.channels.ClosedChannelException
        at org.apache.spark.network.client.TransportClient$RpcChannelListener.handleFailure(TransportClient.java:357)
        at org.apache.spark.network.client.TransportClient$StdChannelListener.operationComplete(TransportClient.java:334)
        at io.netty.util.concurrent.DefaultPromise.notifyListener0(DefaultPromise.java:507)
        at io.netty.util.concurrent.DefaultPromise.notifyListenersNow(DefaultPromise.java:481)
 

The usual cause of this error is that the containers were allocated too little memory: when a container exceeds its physical or virtual memory limit, YARN kills it, and the Spark application's RPC channel to that executor is closed, producing the ClosedChannelException above.
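Disabling YARN's memory checks (the yarn-site.xml change below) works, but the less drastic fix is to give the executors more headroom so the container never exceeds its limit in the first place. A sketch of the relevant settings in spark-defaults.conf; the values here are illustrative, not tuned for any particular cluster (on Spark versions before 2.3 the overhead key is spark.yarn.executor.memoryOverhead, in MB):

```
# spark-defaults.conf -- illustrative values, adjust for your cluster
spark.executor.memory          2g
# Extra off-heap headroom per executor container; before Spark 2.3
# the key is spark.yarn.executor.memoryOverhead
spark.executor.memoryOverhead  512m
```

The same settings can be passed on the spark-submit command line with `--conf`.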

Add the following properties to yarn-site.xml (this disables the NodeManager's physical- and virtual-memory checks, so containers are no longer killed for exceeding their limits):

	<property>
		<name>yarn.nodemanager.pmem-check-enabled</name>
		<value>false</value>
	</property>
	<property>
		<name>yarn.nodemanager.vmem-check-enabled</name>
		<value>false</value>
	</property>

The change must be made on every NodeManager host, and the NodeManagers restarted for it to take effect.
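To confirm that YARN (rather than Spark itself) killed the container, fetch the application logs with `yarn logs -applicationId <appId>` and look for the NodeManager's "running beyond ... memory limits" diagnostic. A minimal sketch of that check in Python, using an illustrative log excerpt (SAMPLE_LOG is a typical example of the diagnostic, not output captured from this incident):

```python
import re

# Typical NodeManager diagnostic when a container breaches its memory limit
# (illustrative text, not taken from the incident above)
SAMPLE_LOG = """\
Container [pid=1234,containerID=container_e01_0001] is running beyond virtual memory limits.
Current usage: 1.1 GB of 1 GB physical memory used; 2.6 GB of 2.1 GB virtual memory used.
Killing container.
"""

def find_memory_kills(log_text):
    """Return log lines showing a container was killed for exceeding memory limits."""
    pattern = re.compile(r"running beyond (physical|virtual) memory limits")
    return [line for line in log_text.splitlines() if pattern.search(line)]

for line in find_memory_kills(SAMPLE_LOG):
    print(line)
```

If such a line appears, the failure is a container memory kill, and either raising the memory overhead or disabling the pmem/vmem checks above will address it.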