Contents
Keywords
Symptom
Java-side error
Linux-side error
Cause
Solution
Keywords for the wordCount demo runtime error:
java.lang.NullPointerException
org.apache.spark.storage.BlockManagerMaster.registerBlockManager
java.io.InvalidClassException
org.apache.spark.rpc.netty.NettyRpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = -4186747031772874359, local class serialVersionUID = 6257082371135760434
Scala version
spark-streaming version
Symptom

Java-side error

19/11/05 15:06:05 INFO SparkEnv: Registering OutputCommitCoordinator
19/11/05 15:06:06 INFO Utils: Successfully started service 'SparkUI' on port 4040.
19/11/05 15:06:06 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://DD-HP5500:4040
19/11/05 15:06:06 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://axe1:7077...
19/11/05 15:06:06 INFO TransportClientFactory: Successfully created connection to axe1/192.168.86.101:7077 after 46 ms (0 ms spent in bootstraps)
19/11/05 15:06:26 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://axe1:7077...
19/11/05 15:06:46 INFO StandaloneAppClient$ClientEndpoint: Connecting to master spark://axe1:7077...
19/11/05 15:07:06 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.
19/11/05 15:07:06 WARN StandaloneSchedulerBackend: Application ID is not initialized yet.
19/11/05 15:07:06 INFO SparkUI: Stopped Spark web UI at http://DD-HP5500:4040
19/11/05 15:07:06 INFO StandaloneSchedulerBackend: Shutting down all executors
19/11/05 15:07:06 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 56271.
19/11/05 15:07:06 INFO NettyBlockTransferService: Server created on DD-HP5500:56271
19/11/05 15:07:06 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
19/11/05 15:07:06 INFO CoarseGrainedSchedulerBackend$DriverEndpoint: Asking each executor to shut down
19/11/05 15:07:06 WARN StandaloneAppClient$ClientEndpoint: Drop UnregisterApplication(null) because has not yet connected to master
19/11/05 15:07:06 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
19/11/05 15:07:06 INFO MemoryStore: MemoryStore cleared
19/11/05 15:07:06 INFO BlockManager: BlockManager stopped
19/11/05 15:07:06 INFO BlockManagerMaster: BlockManagerMaster stopped
19/11/05 15:07:06 WARN MetricsSystem: Stopping a MetricsSystem that is not running
19/11/05 15:07:06 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
19/11/05 15:07:06 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, DD-HP5500, 56271, None)
19/11/05 15:07:06 ERROR SparkContext: Error initializing SparkContext.
java.lang.NullPointerException
at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:64)
at org.apache.spark.storage.BlockManager.initialize(BlockManager.scala:252)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:510)
at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:838)
at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:85)
at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:138)
at com.zzj.bigdata.sparkdemo.Test.main(Test.java:27)
19/11/05 15:07:06 INFO SparkContext: Successfully stopped SparkContext
19/11/05 15:07:06 INFO SparkContext: SparkContext already stopped.
Exception in thread "main" java.lang.NullPointerException
at org.apache.spark.storage.BlockManagerMaster.registerBlockManager(BlockManagerMaster.scala:64)
at org.apache.spark.storage.BlockManager.initialize(BlockManager.scala:252)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:510)
at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:838)
at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:85)
at org.apache.spark.streaming.api.java.JavaStreamingContext.<init>(JavaStreamingContext.scala:138)
at com.zzj.bigdata.sparkdemo.Test.main(Test.java:27)
19/11/05 15:07:06 INFO ShutdownHookManager: Shutdown hook called
19/11/05 15:07:06 INFO ShutdownHookManager: Deleting directory C:\Users\86006\AppData\Local\Temp\spark-254491e8-3ca7-41a6-b738-29f65a5088a3
Linux-side error

On the Linux side, the master's log shows why the driver's registration messages were rejected:

【NettyRpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = -4186747031772874359, local class serialVersionUID = 6257082371135760434】
19/11/05 15:06:46 ERROR TransportRequestHandler: Error while invoking RpcHandler#receive() for one-way message.
java.io.InvalidClassException: org.apache.spark.rpc.netty.NettyRpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = -4186747031772874359, local class serialVersionUID = 6257082371135760434
at java.io.ObjectStreamClass.initNonProxy(ObjectStreamClass.java:699)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1885)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1751)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2042)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2287)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2211)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2069)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1573)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:75)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:108)
at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1$$anonfun$apply$1.apply(NettyRpcEnv.scala:271)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:320)
at org.apache.spark.rpc.netty.NettyRpcEnv$$anonfun$deserialize$1.apply(NettyRpcEnv.scala:270)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.rpc.netty.NettyRpcEnv.deserialize(NettyRpcEnv.scala:269)
at org.apache.spark.rpc.netty.RequestMessage$.apply(NettyRpcEnv.scala:611)
at org.apache.spark.rpc.netty.NettyRpcHandler.internalReceive(NettyRpcEnv.scala:662)
at org.apache.spark.rpc.netty.NettyRpcHandler.receive(NettyRpcEnv.scala:654)
at org.apache.spark.network.server.TransportRequestHandler.processOneWayMessage(TransportRequestHandler.java:274)
at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:105)
at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)
19/11/05 15:07:06 INFO Master: 192.168.86.1:56201 got disassociated, removing it.
19/11/05 15:07:06 INFO Master: DD-HP5500:56190 got disassociated, removing it.
Cause

Investigation showed that the two sides were running different Scala versions, so the serialVersionUID of the serialized class differed between sender and receiver, and the error surfaced when the receiving side tried to deserialize the data.
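Spark's NettyRpcEndpointRef evidently does not declare an explicit serialVersionUID, so the JVM computes one from the compiled class, and the Scala 2.11 and 2.12 builds of Spark produce different values; hence the two mismatched UIDs in the log above. The sketch below illustrates the mechanism with a hypothetical Payload class (nothing here is Spark code):

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerialUidDemo {

    // Hypothetical stand-in for NettyRpcEndpointRef. If the sender's copy of
    // this class carried serialVersionUID = 1L and the receiver's copy carried
    // a different value (or different bytecode, as a different Scala build
    // would produce), readObject would throw java.io.InvalidClassException
    // with the same "local class incompatible" message seen in the master log.
    static class Payload implements Serializable {
        private static final long serialVersionUID = 1L;
        String value = "hello";
    }

    public static void main(String[] args) throws Exception {
        // Serialize with the "driver side" class definition.
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(new Payload());
        }
        // Deserialize with the "cluster side" class definition. In this single
        // JVM both sides share one class file, so the call succeeds; with
        // mismatched serialVersionUIDs it fails exactly as in the log.
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            Payload p = (Payload) in.readObject();
            System.out.println(p.value);
        }
    }
}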
The Scala version on the Linux server is 2.11.12:
[root@axe1 bin]# ./spark-shell
19/11/05 22:45:11 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://axe1:4040
Spark context available as 'sc' (master = local[*], app id = local-1572965130207).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.4
      /_/
Using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_201)
Type in expressions to have them evaluated.
Type :help for more information.
scala>
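To compare the two sides directly, you can also print the serialVersionUID that a given classpath computes for the class; run this once against the driver's dependencies and once against the cluster's jars, and the two mismatched numbers from the log should reappear. A diagnostic sketch, assuming the Spark jars are on the classpath:

import java.io.ObjectStreamClass;

public class UidCheck {
    public static void main(String[] args) throws Exception {
        // Load the class the RPC layer serializes and ask the JVM which
        // serialVersionUID it computes for it on this classpath.
        Class<?> cls = Class.forName("org.apache.spark.rpc.netty.NettyRpcEndpointRef");
        System.out.println(ObjectStreamClass.lookup(cls).getSerialVersionUID());
    }
}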
Solution

The Maven dependency on the Java side, however, was the Scala 2.12 build (the suffix on the artifactId is the Scala binary version the artifact was compiled against):
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.12</artifactId>
    <version>2.4.4</version>
</dependency>
After replacing it with the 2.11 build, the program works normally:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>2.4.4</version>
</dependency>
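For reference, the failing Test class (whose line 27 is the JavaStreamingContext constructor in the stack trace) was presumably something like the following socket wordCount. The body, source host, and port below are assumptions; only the master URL is taken from the log:

import java.util.Arrays;

import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaPairDStream;
import org.apache.spark.streaming.api.java.JavaReceiverInputDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

import scala.Tuple2;

public class Test {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf()
                .setAppName("wordCount")
                .setMaster("spark://axe1:7077"); // standalone master from the log

        // The stack trace points at this constructor: the driver cannot finish
        // initializing because the master never accepts its registration.
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(5));

        // Host and port of the text source are placeholders.
        JavaReceiverInputDStream<String> lines = jssc.socketTextStream("axe1", 9999);

        JavaPairDStream<String, Integer> counts = lines
                .flatMap(line -> Arrays.asList(line.split(" ")).iterator())
                .mapToPair(word -> new Tuple2<>(word, 1))
                .reduceByKey(Integer::sum);

        counts.print();
        jssc.start();
        jssc.awaitTermination();
    }
}

If the error persists after changing the artifactId, check that no other spark-*_2.12 artifacts are still pulled in transitively; every Spark module on the driver classpath must share one Scala binary version.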