Serialization error when integrating Kafka 2.11-0.11.0.0 with Spark Streaming

Kafka_2.11-0.11.0.0

spark-streaming-kafka-0-10_2.11

The error message is as follows:

java.io.NotSerializableException: org.apache.kafka.clients.consumer.ConsumerRecord
Serialization stack:
	- object not serializable (class: org.apache.kafka.clients.consumer.ConsumerRecord, value: ConsumerRecord(topic = news, partition = 0, offset = 115900, CreateTime = 1548486965892, checksum = 3320474937, serialized key size = -1, serialized value size = 51, key = null, value = 2019-01-26 1548486965891 911 550 entertainment view))
	- element of array (index: 0)
	- array (class [Lorg.apache.kafka.clients.consumer.ConsumerRecord;, size 11)
	at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
	at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:46)
	at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:450)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
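
Note that the array "size 11" in the stack trace matches DStream.print(), which internally does take(11) on each batch and ships the result back to the driver as a serialized task result. Below is a minimal sketch of the kind of job that triggers the exception; the broker address, group id, and batch interval are assumptions, while the topic name news comes from the log above:

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.{ConsumerStrategies, KafkaUtils, LocationStrategies}

val conf = new SparkConf().setAppName("KafkaReceiver").setMaster("local[3]")
val ssc = new StreamingContext(conf, Seconds(5))  // assumed batch interval

val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "localhost:9092",   // assumed broker address
  "key.deserializer" -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id" -> "news-consumer",             // hypothetical group id
  "auto.offset.reset" -> "latest"
)

val stream = KafkaUtils.createDirectStream[String, String](
  ssc,
  LocationStrategies.PreferConsistent,
  ConsumerStrategies.Subscribe[String, String](Seq("news"), kafkaParams)
)

// print() makes each executor serialize an array of ConsumerRecord objects
// as the task result; the default JavaSerializer rejects them because
// ConsumerRecord does not implement java.io.Serializable.
stream.print()

ssc.start()
ssc.awaitTermination()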

Solution

When creating the SparkContext, set the following property on its SparkConf:
set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")

This works because Spark's default JavaSerializer can only serialize classes that implement java.io.Serializable, which ConsumerRecord does not; Kryo has no such requirement.

val sparkConf = new SparkConf().setAppName("KafkaReceiver")
                .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
                .setMaster("local[3]")
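
With Kryo configured, the ConsumerRecord task results serialize cleanly and print() no longer throws. As a side note (an alternative not covered in the original post), you can also avoid the exception entirely by extracting the payload before any operation that ships records off the executors, so no ConsumerRecord ever needs to be serialized:

// Hypothetical alternative: extract the String value first, then print.
stream.map(record => record.value()).print()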

