Kafka: compile source code

This post records a problem hit while compiling the Kafka 0.8.1.1 source: running gradlew build fails because the signing task has no configured signatory. Below are the steps that reproduce the error and the workaround of excluding the signing task from the build.

# tar xzf kafka-0.8.1.1-src.tgz

# cd kafka-0.8.1.1-src

# ./gradlew build

 

error: Cannot perform signing task ':clients:signArchives' because it has no configured signatory
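The signArchives task comes from Gradle's signing plugin, which the Kafka build applies to release artifacts; it fails unless a PGP signatory is configured. If signed artifacts are actually needed, the usual fix is to supply the signing properties, for example in ~/.gradle/gradle.properties. The key ID, passphrase, and keyring path below are placeholders, not real values:

```
# ~/.gradle/gradle.properties -- placeholder values, substitute your own key
signing.keyId=XXXXXXXX
signing.password=your-key-passphrase
signing.secretKeyRingFile=/path/to/secring.gpg
```

For a local build, signing is unnecessary, so the simpler route is to skip the task entirely.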

 

# ./gradlew releaseTarGzAll -x signArchives
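Gradle's -x (--exclude-task) flag skips the named task, along with anything that only runs because of it, so the release tarballs are built without signing. If the build succeeds, the archive should appear under the core module's distributions directory; the exact path below is assumed for the 0.8.x source layout:

```
# Skip the signing tasks while building the release tarballs
./gradlew releaseTarGzAll -x signArchives

# The resulting release archive is expected under (path assumed for 0.8.x):
ls core/build/distributions/
```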

    <properties>
      <profileActive>prod</profileActive>
    </properties>
  </profile>
</profiles>
</project>

The above is the POM configuration; here is the output:

PS D:\kingbaseProject\crmp-data-syncjob> mvn clean dependency:tree -U -Dincludes="io.debezium:*,org.apache.kafka:*"
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] 多数据源同步服务                                                    [pom]
[INFO] crmp-data-syncjob-common                                           [jar]
[INFO] crmp-data-syncjob-domain                                           [jar]
[INFO] crmp-data-syncjob-dao                                              [jar]
[INFO] crmp-data-syncjob-service                                          [jar]
[INFO] crmp-data-syncjob-web                                              [jar]
[INFO]
[INFO] -------------------< com.crmp.ecc:crmp-data-syncjob >-------------------
[INFO] Building 多数据源同步服务 1.0.0-SNAPSHOT                             [1/6]
[INFO]   from pom.xml
[INFO] --------------------------------[ pom ]---------------------------------
[INFO]
[INFO] --- maven-clean-plugin:3.2.0:clean (default-clean) @ crmp-data-syncjob ---
[INFO]
[INFO] --- maven-dependency-plugin:3.3.0:tree (default-cli) @ crmp-data-syncjob ---
[INFO] com.crmp.ecc:crmp-data-syncjob:pom:1.0.0-SNAPSHOT
[INFO] +- io.debezium:debezium-core:jar:1.9.8.Final:compile
[INFO] |  \- io.debezium:debezium-api:jar:1.9.8.Final:compile
[INFO] +- io.debezium:debezium-connector-mysql:jar:1.9.8.Final:compile
[INFO] |  \- io.debezium:debezium-ddl-parser:jar:1.9.8.Final:compile
[INFO] +- io.debezium:debezium-connector-oracle:jar:1.9.8.Final:compile
[INFO] +- org.apache.kafka:kafka-clients:jar:2.8.1:compile
[INFO] +- org.apache.kafka:connect-api:jar:2.8.1:compile
[INFO] +- org.apache.kafka:connect-json:jar:2.8.1:compile
[INFO] \- org.apache.kafka:connect-transforms:jar:2.8.1:compile
[INFO]
[INFO] ---------------< com.crmp.ecc:crmp-data-syncjob-common >----------------
[INFO] Building crmp-data-syncjob-common 1.0.0-SNAPSHOT                   [2/6]
[INFO]   from crmp-data-syncjob-common\pom.xml
[INFO] --------------------------------[ jar ]---------------------------------
[INFO]
[INFO] --- maven-clean-plugin:3.2.0:clean (default-clean) @ crmp-data-syncjob-common ---
[INFO]
[INFO] --- maven-dependency-plugin:3.3.0:tree (default-cli) @ crmp-data-syncjob-common ---
[INFO] com.crmp.ecc:crmp-data-syncjob-common:jar:1.0.0-SNAPSHOT
[INFO] +- io.debezium:debezium-core:jar:1.9.8.Final:compile
[INFO] |  \- io.debezium:debezium-api:jar:1.9.8.Final:compile
[INFO] +- io.debezium:debezium-connector-mysql:jar:1.9.8.Final:compile
[INFO] |  \- io.debezium:debezium-ddl-parser:jar:1.9.8.Final:compile
[INFO] +- io.debezium:debezium-connector-oracle:jar:1.9.8.Final:compile
[INFO] +- org.apache.kafka:kafka-clients:jar:2.8.1:compile
[INFO] +- org.apache.kafka:connect-api:jar:2.8.1:compile
[INFO] +- org.apache.kafka:connect-json:jar:2.8.1:compile
[INFO] \- org.apache.kafka:connect-transforms:jar:2.8.1:compile
[INFO]
[INFO] ---------------< com.crmp.ecc:crmp-data-syncjob-domain >----------------
[INFO] Building crmp-data-syncjob-domain 1.0.0-SNAPSHOT                   [3/6]
[INFO]   from crmp-data-syncjob-domain\pom.xml
[INFO] --------------------------------[ jar ]---------------------------------
[INFO]
[INFO] --- maven-clean-plugin:3.2.0:clean (default-clean) @ crmp-data-syncjob-domain ---
[INFO]
[INFO] --- maven-dependency-plugin:3.3.0:tree (default-cli) @ crmp-data-syncjob-domain ---
[INFO] com.crmp.ecc:crmp-data-syncjob-domain:jar:1.0.0-SNAPSHOT
[INFO] +- io.debezium:debezium-core:jar:1.9.8.Final:compile
[INFO] |  \- io.debezium:debezium-api:jar:1.9.8.Final:compile
[INFO] +- io.debezium:debezium-connector-mysql:jar:1.9.8.Final:compile
[INFO] |  \- io.debezium:debezium-ddl-parser:jar:1.9.8.Final:compile
[INFO] +- io.debezium:debezium-connector-oracle:jar:1.9.8.Final:compile
[INFO] +- org.apache.kafka:kafka-clients:jar:2.8.1:compile
[INFO] +- org.apache.kafka:connect-api:jar:2.8.1:compile
[INFO] +- org.apache.kafka:connect-json:jar:2.8.1:compile
[INFO] \- org.apache.kafka:connect-transforms:jar:2.8.1:compile
[INFO]
[INFO] -----------------< com.crmp.ecc:crmp-data-syncjob-dao >-----------------
[INFO] Building crmp-data-syncjob-dao 1.0.0-SNAPSHOT                      [4/6]
[INFO]   from crmp-data-syncjob-dao\pom.xml
[INFO] --------------------------------[ jar ]---------------------------------
[INFO]
[INFO] --- maven-clean-plugin:3.2.0:clean (default-clean) @ crmp-data-syncjob-dao ---
[INFO]
[INFO] --- maven-dependency-plugin:3.3.0:tree (default-cli) @ crmp-data-syncjob-dao ---
[INFO] com.crmp.ecc:crmp-data-syncjob-dao:jar:1.0.0-SNAPSHOT
[INFO] +- io.debezium:debezium-core:jar:1.9.8.Final:compile
[INFO] |  \- io.debezium:debezium-api:jar:1.9.8.Final:compile
[INFO] +- io.debezium:debezium-connector-mysql:jar:1.9.8.Final:compile
[INFO] |  \- io.debezium:debezium-ddl-parser:jar:1.9.8.Final:compile
[INFO] +- io.debezium:debezium-connector-oracle:jar:1.9.8.Final:compile
[INFO] +- org.apache.kafka:kafka-clients:jar:2.8.1:compile
[INFO] +- org.apache.kafka:connect-api:jar:2.8.1:compile
[INFO] +- org.apache.kafka:connect-json:jar:2.8.1:compile
[INFO] \- org.apache.kafka:connect-transforms:jar:2.8.1:compile
[INFO]
[INFO] ---------------< com.crmp.ecc:crmp-data-syncjob-service >---------------
[INFO] Building crmp-data-syncjob-service 1.0.0-SNAPSHOT                  [5/6]
[INFO]   from crmp-data-syncjob-service\pom.xml
[INFO] --------------------------------[ jar ]---------------------------------
[INFO]
[INFO] --- maven-clean-plugin:3.2.0:clean (default-clean) @ crmp-data-syncjob-service ---
[INFO]
[INFO] --- maven-dependency-plugin:3.3.0:tree (default-cli) @ crmp-data-syncjob-service ---
[INFO] com.crmp.ecc:crmp-data-syncjob-service:jar:1.0.0-SNAPSHOT
[INFO] +- io.debezium:debezium-core:jar:1.9.8.Final:compile
[INFO] |  \- io.debezium:debezium-api:jar:1.9.8.Final:compile
[INFO] +- io.debezium:debezium-connector-mysql:jar:1.9.8.Final:compile
[INFO] |  \- io.debezium:debezium-ddl-parser:jar:1.9.8.Final:compile
[INFO] +- io.debezium:debezium-connector-oracle:jar:1.9.8.Final:compile
[INFO] +- org.apache.kafka:kafka-clients:jar:2.8.1:compile
[INFO] +- org.apache.kafka:connect-api:jar:2.8.1:compile
[INFO] +- org.apache.kafka:connect-json:jar:2.8.1:compile
[INFO] \- org.apache.kafka:connect-transforms:jar:2.8.1:compile
[INFO]
[INFO] -----------------< com.crmp.ecc:crmp-data-syncjob-web >-----------------
[INFO] Building crmp-data-syncjob-web 1.0.0-SNAPSHOT                      [6/6]
[INFO]   from crmp-data-syncjob-web\pom.xml
[INFO] --------------------------------[ jar ]---------------------------------
[INFO]
[INFO] --- maven-clean-plugin:3.2.0:clean (default-clean) @ crmp-data-syncjob-web ---
[INFO]
[INFO] --- maven-dependency-plugin:3.3.0:tree (default-cli) @ crmp-data-syncjob-web ---
[INFO] com.crmp.ecc:crmp-data-syncjob-web:jar:1.0.0-SNAPSHOT
[INFO] +- io.debezium:debezium-core:jar:1.9.8.Final:compile
[INFO] |  \- io.debezium:debezium-api:jar:1.9.8.Final:compile
[INFO] +- io.debezium:debezium-connector-mysql:jar:1.9.8.Final:compile
[INFO] |  \- io.debezium:debezium-ddl-parser:jar:1.9.8.Final:compile
[INFO] +- io.debezium:debezium-connector-oracle:jar:1.9.8.Final:compile
[INFO] +- org.apache.kafka:kafka-clients:jar:2.8.1:compile
[INFO] +- org.apache.kafka:connect-api:jar:2.8.1:compile
[INFO] +- org.apache.kafka:connect-json:jar:2.8.1:compile
[INFO] \- org.apache.kafka:connect-transforms:jar:2.8.1:compile
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for 多数据源同步服务 1.0.0-SNAPSHOT:
[INFO]
[INFO] 多数据源同步服务 ........................................... SUCCESS [ 1.648 s]
[INFO] crmp-data-syncjob-common ........................... SUCCESS [ 0.115 s]
[INFO] crmp-data-syncjob-domain ........................... SUCCESS [ 0.085 s]
[INFO] crmp-data-syncjob-dao .............................. SUCCESS [ 0.131 s]
[INFO] crmp-data-syncjob-service .......................... SUCCESS [ 0.113 s]
[INFO] crmp-data-syncjob-web .............................. SUCCESS [ 0.121 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  2.683 s
[INFO] Finished at: 2025-11-25T15:00:10+08:00
[INFO] ------------------------------------------------------------------------

The build succeeds, but the job still fails with this error:

2025-11-25 14:34:23
java.lang.NoSuchMethodError: 'io.debezium.config.Field io.debezium.config.Field.withType(org.apache.kafka.common.config.ConfigDef$Type)'
	at io.debezium.relational.HistorizedRelationalDatabaseConnectorConfig.<clinit>(HistorizedRelationalDatabaseConnectorConfig.java:48)
	at io.debezium.relational.RelationalDatabaseConnectorConfig.lambda$new$0(RelationalDatabaseConnectorConfig.java:636)
	at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:176)
	at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:177)
	at java.base/java.util.HashMap$KeySpliterator.forEachRemaining(HashMap.java:1810)
	at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
	at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
	at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913)
	at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578)
	at io.debezium.config.Configuration$5.keys(Configuration.java:1608)
	at io.debezium.config.Configuration$5.keys(Configuration.java:1605)
	at io.debezium.config.Configuration.map(Configuration.java:1567)
	at io.debezium.config.Configuration.subset(Configuration.java:1553)
	at io.debezium.relational.RelationalDatabaseConnectorConfig.<init>(RelationalDatabaseConnectorConfig.java:638)
	at io.debezium.connector.kes.KingbaseConnectorConfig.<init>(KingbaseConnectorConfig.java:1024)
	at org.apache.flink.cdc.connectors.kes.source.config.KingbaseSourceConfig.getDbzConnectorConfig(KingbaseSourceConfig.java:129)
	at org.apache.flink.cdc.connectors.kes.source.config.KingbaseSourceConfig.getDbzConnectorConfig(KingbaseSourceConfig.java:35)
	at org.apache.flink.cdc.connectors.base.config.JdbcSourceConfig.getTableFilters(JdbcSourceConfig.java:165)
	at org.apache.flink.cdc.connectors.kes.source.KingbaseDialect.isIncludeDataCollection(KingbaseDialect.java:219)
	at org.apache.flink.cdc.connectors.kes.source.KingbaseDialect.isIncludeDataCollection(KingbaseDialect.java:59)
	at org.apache.flink.cdc.connectors.base.source.reader.IncrementalSourceReader.addSplits(IncrementalSourceReader.java:266)
	at org.apache.flink.cdc.connectors.base.source.reader.IncrementalSourceReader.addSplits(IncrementalSourceReader.java:248)
	at org.apache.flink.streaming.api.operators.SourceOperator.handleAddSplitsEvent(SourceOperator.java:625)
	at org.apache.flink.streaming.api.operators.SourceOperator.handleOperatorEvent(SourceOperator.java:595)
	at org.apache.flink.streaming.runtime.tasks.OperatorEventDispatcherImpl.dispatchEventToHandlers(OperatorEventDispatcherImpl.java:72)
	at org.apache.flink.streaming.runtime.tasks.RegularOperatorChain.dispatchOperatorEvent(RegularOperatorChain.java:80)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.lambda$dispatchOperatorEvent$22(StreamTask.java:1548)
	at org.apache.flink.streaming.runtime.tasks.StreamTaskActionExecutor$1.runThrowing(StreamTaskActionExecutor.java:50)
	at org.apache.flink.streaming.runtime.tasks.mailbox.Mail.run(Mail.java:90)
	at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMail(MailboxProcessor.java:398)
	at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.processMailsWhenDefaultActionUnavailable(MailboxProcessor.java:367)
	at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.processMail(MailboxProcessor.java:352)
	at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxLoop(MailboxProcessor.java:229)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.runMailboxLoop(StreamTask.java:917)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:859)
	at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:958)
	at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:937)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:751)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:566)
	at java.base/java.lang.Thread.run(Thread.java:829)

2025-11-25 14:34:53
java.lang.NoClassDefFoundError: Could not initialize class io.debezium.relational.HistorizedRelationalDatabaseConnectorConfig
	at io.debezium.relational.RelationalDatabaseConnectorConfig.lambda$new$0(RelationalDatabaseConnectorConfig.java:636)
	at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:176)
	at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:177)
	at java.base/java.util.HashMap$KeySpliterator.forEachRemaining(HashMap.java:1810)
	at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484)
	at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474)
	at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913)
	at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578)
	at io.debezium.config.Configuration$5.keys(Configuration.java:1608)
	at io.debezium.config.Configuration$5.keys(Configuration.java:1605)
	at io.debezium.config.Configuration.map(Configuration.java:1567)
	at io.debezium.config.Configuration.subset(Configuration.java:1553)
	at io.debezium.relational.RelationalDatabaseConnectorConfig.<init>(RelationalDatabaseConnectorConfig.java:638)
	at io.debezium.connector.kes.KingbaseConnectorConfig.<init>(KingbaseConnectorConfig.java:1024)
	at org.apache.flink.cdc.connectors.kes.source.config.KingbaseSourceConfig.getDbzConnectorConfig(KingbaseSourceConfig.java:129)
	at org.apache.flink.cdc.connectors.kes.source.config.KingbaseSourceConfig.getDbzConnectorConfig(KingbaseSourceConfig.java:35)
	at org.apache.flink.cdc.connectors.base.config.JdbcSourceConfig.getTableFilters(JdbcSourceConfig.java:165)
	at org.apache.flink.cdc.connectors.kes.source.KingbaseDialect.isIncludeDataCollection(KingbaseDialect.java:219)
	at org.apache.flink.cdc.connectors.kes.source.KingbaseDialect.isIncludeDataCollection(KingbaseDialect.java:59)
	at org.apache.flink.cdc.connectors.base.source.reader.IncrementalSourceReader.addSplits(IncrementalSourceReader.java:266)
	at org.apache.flink.cdc.connectors.base.source.reader.IncrementalSourceReader.addSplits(IncrementalSourceReader.java:248)
	at org.apache.flink.streaming.api.operators.SourceOperator.handleAddSplitsEvent(SourceOperator.java:625)
	at org.apache.flink.streaming.api.operators.SourceOperator.handleOperatorEvent(SourceOperator.java:595)
	at org.apache.flink.streaming.runtime.tasks.OperatorEventDispatcherImpl.dispatchEventToHandlers(OperatorEventDispatcherImpl.java:72)
	at org.apache.flink.streaming.runtime.tasks.RegularOperatorChain.dispatchOperatorEvent(RegularOperatorChain.java:80)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.lambda$dispatchOperatorEvent$22(StreamTask.java:1548)
	at org.apache.flink.streaming.runtime.tasks.StreamTaskActionExecutor$1.runThrowing(StreamTaskActionExecutor.java:50)
	at org.apache.flink.streaming.runtime.tasks.mailbox.Mail.run(Mail.java:90)
	at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMail(MailboxProcessor.java:398)
	at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.processMailsWhenDefaultActionUnavailable(MailboxProcessor.java:367)
	at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.processMail(MailboxProcessor.java:352)
	at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxLoop(MailboxProcessor.java:229)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.runMailboxLoop(StreamTask.java:917)
	at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:859)
	at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:958)
	at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:937)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:751)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:566)
	at java.base/java.lang.Thread.run(Thread.java:829)
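A `NoSuchMethodError` on `io.debezium.config.Field.withType(ConfigDef$Type)` typically means the `io.debezium.config.Field` class the JVM actually loaded at runtime does not match the Debezium version that `HistorizedRelationalDatabaseConnectorConfig` (and the Kingbase connector) was compiled against — for example, when a fat connector jar bundles its own copy of debezium-core that shadows the 1.9.8.Final pinned in the POM, so `mvn dependency:tree` looks clean while the runtime classpath is not. A useful first diagnostic is to ask the JVM where it loaded the class from. The sketch below is a minimal, hypothetical helper (the class name `WhichJar` is mine, not from the project); run it on the same classpath as the Flink job:

```java
import java.security.CodeSource;

// Minimal diagnostic sketch: report which jar (or directory) a class was
// loaded from. Pointing it at io.debezium.config.Field on the Flink job's
// classpath reveals which artifact actually supplies the conflicting class.
public class WhichJar {
    static String locate(Class<?> cls) {
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        // Classes loaded by the bootstrap loader (JDK classes) have no code source.
        return src == null ? "bootstrap/JDK" : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // The class name is a parameter so the same helper also works for
        // org.apache.kafka.common.config.ConfigDef, the other side of the signature.
        String name = args.length > 0 ? args[0] : "io.debezium.config.Field";
        System.out.println(name + " -> " + locate(Class.forName(name)));
    }
}
```

If the reported location is the `flink-sql-cdc-connector-kes-v2` fat jar rather than `debezium-core-1.9.8.Final.jar`, the POM exclusions have no effect because the class is bundled inside the connector itself; in that case the fix is to align the pinned Debezium (and kafka-clients) versions with whatever the connector embeds, or relocate the bundled copy, rather than adding more exclusions.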