Serialization issue when integrating Kafka 2.11-0.11.0.0 with Spark Streaming

This article describes a NotSerializableException encountered when integrating Kafka 2.11-0.11.0.0 with Spark Streaming, and shows how to resolve it by configuring the Kryo serializer.


Versions used:

Kafka_2.11-0.11.0.0
spark-streaming-kafka-0-10_2.11

The error message is as follows:

java.io.NotSerializableException: org.apache.kafka.clients.consumer.ConsumerRecord
Serialization stack:
	- object not serializable (class: org.apache.kafka.clients.consumer.ConsumerRecord, value: ConsumerRecord(topic = news, partition = 0, offset = 115900, CreateTime = 1548486965892, checksum = 3320474937, serialized key size = -1, serialized value size = 51, key = null, value = 2019-01-26 1548486965891 911 550 entertainment view))
	- element of array (index: 0)
	- array (class [Lorg.apache.kafka.clients.consumer.ConsumerRecord;, size 11)
	at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
	at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:46)
	at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:450)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
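For context, a pattern like the following can reproduce the error. This is a minimal sketch, not the original author's code: the bootstrap server, group id, and batch interval are placeholder assumptions, and only the topic name news is taken from the log above. Calling print() on the raw stream ships ConsumerRecord objects back to the driver as task results, which the default Java serializer rejects:

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010._

object NotSerializableRepro {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("KafkaReceiver").setMaster("local[3]")
    val ssc = new StreamingContext(conf, Seconds(5))

    // Connection parameters below are placeholders, not from the original post
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "demo-group",
      "auto.offset.reset" -> "latest"
    )

    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      LocationStrategies.PreferConsistent,
      ConsumerStrategies.Subscribe[String, String](Array("news"), kafkaParams)
    )

    // print() internally collects a few records to the driver; with the
    // default Java serializer this throws NotSerializableException because
    // ConsumerRecord does not implement java.io.Serializable.
    stream.print()

    ssc.start()
    ssc.awaitTermination()
  }
}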

Solution

Set the following property when creating the SparkContext:
set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")

import org.apache.spark.SparkConf

val sparkConf = new SparkConf().setAppName("KafkaReceiver")
  // Kryo can serialize ConsumerRecord even though it does not implement java.io.Serializable
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .setMaster("local[3]")
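
With Kryo, the snippet above works because Kryo does not require serialized classes to implement java.io.Serializable. An alternative sketch (assuming a stream of ConsumerRecord such as the one in the repro above): extract the value on the executors before anything is sent back to the driver, so the ConsumerRecord itself is never serialized and the default Java serializer also suffices:

stream.map(record => record.value()).print()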

 
