flink-connector-kafka conflict

This post walks through a NoClassDefFoundError hit when using the Flink Kafka consumer: the JVM cannot find org.apache.kafka.common.serialization.ByteArrayDeserializer at runtime. It covers how to resolve the problem by checking for conflicting jars and making sure the Flink and Kafka connector versions are compatible.



java.lang.NoClassDefFoundError: org/apache/kafka/common/serialization/ByteArrayDeserializer  
        at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09.setDeserializer(FlinkKafkaConsumer09.java:271)  
        at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer09.<init>(FlinkKafkaConsumer09.java:158)  
        at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010.<init>(FlinkKafkaConsumer010.java:128)  
        at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010.<init>(FlinkKafkaConsumer010.java:112)  
        at org.apache.flink.streaming.connectors.kafka.FlinkKafkaConsumer010.<init>(FlinkKafkaConsumer010.java:79)  
        at com.test.boot.FlinkKafkaTest.main(FlinkKafkaTest.java:94)  
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)  
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)  
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)  
        at java.lang.reflect.Method.invoke(Method.java:498)  
        at org.apache.flink.client.program.PackagedProgram.callMainMethod(PackagedProgram.java:528)  
        at org.apache.flink.client.program.PackagedProgram.invokeInteractiveModeForExecution(PackagedProgram.java:419)  
        at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:381)  
        at org.apache.flink.client.CliFrontend.executeProgram(CliFrontend.java:878)  
        at org.apache.flink.client.CliFrontend.run(CliFrontend.java:259)  
        at org.apache.flink.client.CliFrontend.parseParameters(CliFrontend.java:1126)  
        at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1173)  
        at org.apache.flink.client.CliFrontend$2.call(CliFrontend.java:1170)  
        at org.apache.flink.runtime.security.HadoopSecurityContext$1.run(HadoopSecurityContext.java:43)  
        at java.security.AccessController.doPrivileged(Native Method)  
        at javax.security.auth.Subject.doAs(Subject.java:422)  
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1781)  
        at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:40)  
        at org.apache.flink.client.CliFrontend.main(CliFrontend.java:1169)  
Caused by: java.lang.ClassNotFoundException: org.apache.kafka.common.serialization.ByteArrayDeserializer  
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)  
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)  
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)  
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)  
        ... 24 more  
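
A quick way to confirm which classpath is missing the class is a small probe run the same way the job is submitted. This is a sketch added for illustration (the class name KafkaClasspathProbe is hypothetical, not from the original post); if it throws ClassNotFoundException on the cluster but not locally, kafka-clients simply never made it onto the cluster-side classpath.

    // Illustrative probe: try to load the class the Kafka connector needs.
    public class KafkaClasspathProbe {
        public static void main(String[] args) {
            try {
                Class<?> clazz = Class.forName(
                        "org.apache.kafka.common.serialization.ByteArrayDeserializer");
                // Prints which jar the class was loaded from (null for bootstrap classes).
                System.out.println("Loaded from: "
                        + clazz.getProtectionDomain().getCodeSource());
            } catch (ClassNotFoundException e) {
                System.out.println("kafka-clients not on the classpath: " + e.getMessage());
            }
        }
    }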

The job runs fine locally but fails with this error on YARN: a jar conflict (or a missing kafka-clients jar) on the cluster classpath. Check the jars under Flink's lib directory and the dependencies packaged into the job's fat jar.
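
For a Maven build, the usual fix is to make sure the Kafka connector, and the kafka-clients jar it pulls in transitively, is packaged into the job's fat jar rather than assumed to exist on the cluster. The fragment below is a minimal sketch under assumptions (Flink 1.3.x, the 0.10 connector matching the FlinkKafkaConsumer010 in the trace, Scala 2.11); the versions must be adjusted to the cluster actually in use.

    <!-- Kafka connector for FlinkKafkaConsumer010; brings in kafka-clients transitively.
         Versions below are assumptions and must match the Flink version on the cluster.
         Do not mark this dependency as provided: Flink's lib/ on YARN normally does not
         ship kafka-clients, so the class has to travel inside the job's fat jar. -->
    <dependency>
        <groupId>org.apache.flink</groupId>
        <artifactId>flink-connector-kafka-0.10_2.11</artifactId>
        <version>1.3.2</version>
    </dependency>

If another dependency pins a different kafka-clients version, the two can shadow each other; mvn dependency:tree shows which version wins, and running jar tf on the built fat jar confirms that org/apache/kafka/common/serialization/ByteArrayDeserializer.class is actually inside before submitting to YARN.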



Reposted from: https://my.oschina.net/u/3005325/blog/3001658
