Fixing the _corrupt_record and name errors when Spark Streaming reads JSON data from Kafka and saves it to MySQL
Software versions used: Spark 2.3.0, IDEA 2019.1, kafka_2.11-0.10.2.2, spark-streaming-kafka-0-10_2.11-2.3.0

First, the code:

package com.bd.spark

import java.util.Properties

import org.apache.kafka.clients.consumer.ConsumerRecord
import org.ap
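The original listing is cut off above, but the error in the title usually comes from letting Spark infer a schema from JSON strings pulled off Kafka: any record that fails to parse surfaces as a `_corrupt_record` column, and selecting a field such as `name` then fails. A common fix is to declare the schema explicitly and parse with `from_json`. The following is only a minimal sketch under that assumption; the field names (`name`, `age`), table name, and JDBC connection details are illustrative placeholders, not taken from the original post.

```scala
package com.bd.spark

import java.util.Properties

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.from_json
import org.apache.spark.sql.types.{IntegerType, StringType, StructField, StructType}

object KafkaJsonSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("KafkaJsonToMySQL")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    // Stand-in for the JSON strings taken from the Kafka record values.
    val raw = Seq("""{"name":"alice","age":30}""").toDF("value")

    // Declaring the schema up front avoids schema inference, which is what
    // produces the _corrupt_record column when a record fails to parse.
    val schema = StructType(Seq(
      StructField("name", StringType),
      StructField("age", IntegerType)
    ))

    // from_json returns null for unparseable records instead of failing the job.
    val parsed = raw
      .select(from_json($"value", schema).as("data"))
      .select("data.*")

    parsed.show()

    // Writing the parsed rows to MySQL via JDBC (placeholder credentials):
    // val props = new Properties()
    // props.setProperty("user", "root")
    // props.setProperty("password", "******")
    // parsed.write.mode("append")
    //   .jdbc("jdbc:mysql://localhost:3306/test", "result_table", props)

    spark.stop()
  }
}
```

In a real streaming job the same `from_json` call would be applied inside `foreachRDD` (or in a Structured Streaming query) to the Kafka record values before the JDBC write.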