Core Java, 7th Edition vs. Core Java, 8th Edition: a diff

This article summarizes the main changes from Core Java, 7th Edition to Core Java, 8th Edition, focusing on new and improved material: the introduction of the StAX parser, new JDBC 4 features, Swing and AWT enhancements, security improvements, updates to distributed objects, and a major overhaul of annotation processing.
I finished Core Java, 8th Edition over the past few days. Since I had already read the 7th Edition, this time I read only the parts that were revised, newly added, or otherwise important. Below I list the main chapters I read, for readers who have been through the 7th Edition and want to read the 8th to pick up some of the new JDK 6 features:

Volume 1: The material in the first volume is fairly basic. Comparing the tables of contents chapter by chapter, the differences are minor, so I skipped most of it.
Chapter 10. Developing Applications and Applets:
The Java Web Start section is worth rereading; it has some updated content: installing desktop and menu shortcuts, supplying an icon, signed code, and so on.

Volume 2: Most of the revised and new content is in this volume.
Chapter 2. XML:
The Streaming Parsers section adds coverage of the StAX parser.
The Generating XML Documents section adds "Writing an XML Document with StAX".
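To give a flavor of what those new sections cover, here is a minimal sketch of both sides of StAX: pull-parsing with XMLStreamReader and emitting a document with XMLStreamWriter. The class name and sample XML are made up for illustration.

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLOutputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamException;
import javax.xml.stream.XMLStreamReader;
import javax.xml.stream.XMLStreamWriter;

public class StaxDemo {

    // Pull parsing: walk the event stream and count element start tags.
    public static int countElements(String xml) throws XMLStreamException {
        XMLStreamReader reader =
                XMLInputFactory.newInstance().createXMLStreamReader(new StringReader(xml));
        int count = 0;
        while (reader.hasNext()) {
            if (reader.next() == XMLStreamConstants.START_ELEMENT) {
                count++;
            }
        }
        reader.close();
        return count;
    }

    // Writing: emit a small document directly, without building a DOM tree first.
    public static String writeDoc() throws XMLStreamException {
        StringWriter out = new StringWriter();
        XMLStreamWriter writer =
                XMLOutputFactory.newInstance().createXMLStreamWriter(out);
        writer.writeStartDocument();
        writer.writeStartElement("book");
        writer.writeAttribute("edition", "8");
        writer.writeCharacters("Core Java");
        writer.writeEndElement();
        writer.writeEndDocument();
        writer.close();
        return out.toString();
    }
}
```

Unlike SAX, the application pulls events at its own pace, which makes the parsing loop much easier to follow.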

Chapter 4. Database Programming:
The chapter introduction
The JDBC Configuration section: configuring and using the Derby database bundled with JDK 6
Executing SQL Statements
Query Execution
Metadata
Transactions
Connection Management in Web and Enterprise Applications
All of the sections above have been revised and expanded to some degree, mainly to introduce the new JDBC 4 features, along with revisions to existing material; they are worth rereading.
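The flavor of the JDBC 4 changes can be sketched in a few lines. The example below assumes the embedded Derby driver (bundled with JDK 6 under db/lib) is on the classpath; the database and table names are made up.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class DerbyDemo {
    public static void main(String[] args) throws SQLException {
        // JDBC 4: drivers on the classpath register themselves via the
        // service-provider mechanism, so the old
        // Class.forName("org.apache.derby.jdbc.EmbeddedDriver") call is no longer needed.
        Connection conn = DriverManager.getConnection("jdbc:derby:sampledb;create=true");
        Statement stat = conn.createStatement();
        stat.executeUpdate("CREATE TABLE Greetings (Message VARCHAR(80))");
        stat.executeUpdate("INSERT INTO Greetings VALUES ('Hello, World!')");
        ResultSet rs = stat.executeQuery("SELECT * FROM Greetings");
        while (rs.next()) {
            System.out.println(rs.getString(1));
        }
        rs.close();
        stat.close();
        conn.close();
    }
}
```

Other JDBC 4 additions covered in those sections include the Wrapper interface and the richer SQLException hierarchy.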

Chapter 6. Advanced Swing:
The Tables section: the material on sorting and filtering has been rewritten around two classes new in JDK 6, TableRowSorter and RowFilter.
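A minimal sketch of that new sorting/filtering API (the table data is made up): TableRowSorter wraps the table model, and a RowFilter.regexFilter hides the rows that do not match.

```java
import javax.swing.RowFilter;
import javax.swing.table.DefaultTableModel;
import javax.swing.table.TableRowSorter;

public class SorterDemo {

    // Build a sorter over a tiny model and filter it so that only rows
    // whose "Name" column matches the regex remain visible.
    public static TableRowSorter<DefaultTableModel> buildSorter() {
        DefaultTableModel model = new DefaultTableModel(
                new Object[][] { { "Alice", 30 }, { "Bob", 25 }, { "Carol", 35 } },
                new Object[] { "Name", "Age" });
        TableRowSorter<DefaultTableModel> sorter =
                new TableRowSorter<DefaultTableModel>(model);
        sorter.setRowFilter(RowFilter.regexFilter("^A", 0)); // filter on column 0
        return sorter;
    }
}
```

In a real UI you would attach the sorter with table.setRowSorter(sorter); the JTable then shows only the filtered, sorted view while the model stays untouched.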

Chapter 7. Advanced AWT:
Adds a new section, Platform Integration.

Chapter 9. Security:
Class Loaders: minor revisions
Digital Signatures: adds a Certificate Requests subsection

Chapter 10. Distributed Objects:
The RMI Programming Model
Parameters and Return Values in Remote Methods
Remote Object Activation
Web Services and JAX-WS
All of the sections above have been partially revised and expanded.

Chapter 11. Scripting, Compiling, and Annotation Processing:
Adds a new section, Scripting for the Java Platform
Adds a new section, The Compiler API
Using Annotations
Annotation Syntax
Standard Annotations
Source-Level Annotation Processing
Bytecode Engineering
The annotation sections above have all been revised to varying degrees, mainly to cover the new annotation features in JDK 6. Since annotations play a very important role in Java EE 5, these sections are recommended for rereading.
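Of the two new sections, the Compiler API lends itself to a compact sketch. The snippet below (the file and class names are made up) generates a trivial source file and compiles it programmatically; note that it only works when running on a JDK, since a plain JRE has no system compiler.

```java
import java.io.File;
import java.io.FileWriter;
import java.io.IOException;
import javax.tools.JavaCompiler;
import javax.tools.ToolProvider;

public class CompilerDemo {

    // Write a trivial class to the temp directory and compile it in-process.
    public static boolean compileHello() throws IOException {
        File dir = new File(System.getProperty("java.io.tmpdir"));
        File src = new File(dir, "Hello.java");
        FileWriter out = new FileWriter(src);
        out.write("public class Hello { public static String greet() { return \"hi\"; } }");
        out.close();
        // getSystemJavaCompiler() returns null on a plain JRE.
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        if (compiler == null) {
            return false;
        }
        // run() returns 0 on success; -d sets the class-file output directory.
        return compiler.run(null, null, null, "-d", dir.getPath(), src.getPath()) == 0;
    }
}
```

The book's section goes further, compiling from in-memory sources via JavaFileManager, but this is the core entry point.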
io.debezium.connector.kes.KingbaseConnectorConfig.<init>(KingbaseConnectorConfig.java:1024) at org.apache.flink.cdc.connectors.kes.source.config.KingbaseSourceConfig.getDbzConnectorConfig(KingbaseSourceConfig.java:129) at org.apache.flink.cdc.connectors.kes.source.config.KingbaseSourceConfig.getDbzConnectorConfig(KingbaseSourceConfig.java:35) at org.apache.flink.cdc.connectors.base.config.JdbcSourceConfig.getTableFilters(JdbcSourceConfig.java:165) at org.apache.flink.cdc.connectors.kes.source.KingbaseDialect.isIncludeDataCollection(KingbaseDialect.java:219) at org.apache.flink.cdc.connectors.kes.source.KingbaseDialect.isIncludeDataCollection(KingbaseDialect.java:59) at org.apache.flink.cdc.connectors.base.source.reader.IncrementalSourceReader.addSplits(IncrementalSourceReader.java:266) at org.apache.flink.cdc.connectors.base.source.reader.IncrementalSourceReader.addSplits(IncrementalSourceReader.java:248) at org.apache.flink.streaming.api.operators.SourceOperator.handleAddSplitsEvent(SourceOperator.java:625) at org.apache.flink.streaming.api.operators.SourceOperator.handleOperatorEvent(SourceOperator.java:595) at org.apache.flink.streaming.runtime.tasks.OperatorEventDispatcherImpl.dispatchEventToHandlers(OperatorEventDispatcherImpl.java:72) at org.apache.flink.streaming.runtime.tasks.RegularOperatorChain.dispatchOperatorEvent(RegularOperatorChain.java:80) at org.apache.flink.streaming.runtime.tasks.StreamTask.lambda$dispatchOperatorEvent$22(StreamTask.java:1548) at org.apache.flink.streaming.runtime.tasks.StreamTaskActionExecutor$1.runThrowing(StreamTaskActionExecutor.java:50) at org.apache.flink.streaming.runtime.tasks.mailbox.Mail.run(Mail.java:90) at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMail(MailboxProcessor.java:398) at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.processMailsWhenDefaultActionUnavailable(MailboxProcessor.java:367) at 
org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.processMail(MailboxProcessor.java:352) at org.apache.flink.streaming.runtime.tasks.mailbox.MailboxProcessor.runMailboxLoop(MailboxProcessor.java:229) at org.apache.flink.streaming.runtime.tasks.StreamTask.runMailboxLoop(StreamTask.java:917) at org.apache.flink.streaming.runtime.tasks.StreamTask.invoke(StreamTask.java:859) at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:958) at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:937) at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:751) at org.apache.flink.runtime.taskmanager.Task.run(Task.java:566) at java.base/java.lang.Thread.run(Thread.java:829)
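A NoSuchMethodError for io.debezium.config.Field.withType(ConfigDef$Type), even though the dependency tree shows debezium-core 1.9.8.Final, usually means a second, older copy of the Field class wins on the runtime classpath — for example one bundled inside a Flink CDC connector fat jar, since the trace runs through org.apache.flink.cdc.connectors. One way to see which jar actually supplies the class at runtime is to print its code source. The helper below is a hypothetical diagnostic sketch, not part of the project:

```java
import java.security.CodeSource;

// Diagnostic sketch (hypothetical helper): print where a class was actually
// loaded from, to spot duplicate or outdated copies on the classpath.
public class WhereIsClass {
    public static String locate(String className) {
        try {
            Class<?> c = Class.forName(className);
            CodeSource src = c.getProtectionDomain().getCodeSource();
            // Bootstrap/JDK classes report no code source.
            return src == null ? "(bootstrap/JDK class)" : src.getLocation().toString();
        } catch (ClassNotFoundException e) {
            return "(not on classpath)";
        }
    }

    public static void main(String[] args) {
        // Pass the class from the stack trace, e.g. io.debezium.config.Field.
        String target = args.length > 0 ? args[0] : "io.debezium.config.Field";
        System.out.println(target + " -> " + locate(target));
    }
}
```

If the printed location is not the debezium-core-1.9.8.Final jar, the jar it points at carries the older class and needs to be excluded, or the connector and standalone Debezium versions need to be aligned.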
package com.example.task6;

import org.mybatis.spring.annotation.MapperScan;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@MapperScan("com.example.task6.Mapper")
@SpringBootApplication
public class Task6Application {
    public static void main(String[] args) {
        SpringApplication.run(Task6Application.class, args);
    }
}

package com.example.task6.Mapper;

import com.baomidou.mybatisplus.core.mapper.BaseMapper;
import com.example.task6.Entity.Phone;
import org.apache.ibatis.annotations.Mapper;

@Mapper
public interface PhoneMapper extends BaseMapper<Phone> {
}

package com.example.task6.Mapper;

import com.baomidou.mybatisplus.core.mapper.BaseMapper;
import com.example.task6.Entity.BasicCard;
import org.apache.ibatis.annotations.Mapper;

@Mapper
public interface BasicCardMapper extends BaseMapper<BasicCard> {
}

package com.example.task6.Entity;

import com.baomidou.mybatisplus.annotation.TableField;
import com.baomidou.mybatisplus.annotation.TableName;
import lombok.Data;

@Data
@TableName("phone") // table-name annotation
public class Phone {
    private String name;
    @TableField("ph_number")
    private String phNumber;
}

package com.example.task6.Entity;

import lombok.Data;
import java.util.List;

@Data
public class CardRequest {
    private String name;
    private String email;
    private String theGroup;
    private List<String> phNumbers;
}

package com.example.task6.Entity;

import com.baomidou.mybatisplus.annotation.IdType;
import com.baomidou.mybatisplus.annotation.TableId;
import com.baomidou.mybatisplus.annotation.TableField;
import com.baomidou.mybatisplus.annotation.TableName;
import lombok.Data;
import java.util.List;

@Data
@TableName("basic_info")
public class BasicCard {
    // @TableId marks this field as the table's primary key;
    // type = IdType.INPUT means the key value is supplied by the user
    @TableId(type = IdType.INPUT) // name is explicitly the primary key
    private String name;
    private String email;
    @TableField("the_group") // maps to the database column
    private String theGroup;
    @TableField(exist = false) // not a database column
    private List<String> phNumbers;
}

package com.example.task6.Controller;

import com.example.task6.Entity.BasicCard;
import com.example.task6.Entity.CardRequest;
import com.example.task6.Entity.Phone;
import com.example.task6.Service.CardOperatingService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.*;
import java.util.List;
import java.util.stream.Collectors;

@RestController
@RequestMapping("/api/cards")
public class CardController {

    private final CardOperatingService cardOperatingService;

    @Autowired
    public CardController(CardOperatingService cardOperatingService) {
        this.cardOperatingService = cardOperatingService;
    }

    // Add a card
    @PostMapping
    public boolean addCard(@RequestBody CardRequest request) {
        BasicCard card = new BasicCard();
        card.setName(request.getName());
        card.setEmail(request.getEmail());
        card.setTheGroup(request.getTheGroup());
        List<Phone> phones = request.getPhNumbers().stream()
                .map(num -> {
                    Phone p = new Phone();
                    p.setName(request.getName());
                    p.setPhNumber(num);
                    return p;
                }).collect(Collectors.toList());
        return cardOperatingService.addCard(card, phones);
    }

    // Delete a card
    @DeleteMapping("/{name}")
    public boolean deleteCard(@PathVariable String name) {
        return cardOperatingService.deleteCard(name);
    }

    // Update a card
    @PutMapping
    public boolean updateCard(@RequestBody CardRequest request) {
        BasicCard card = new BasicCard();
        card.setName(request.getName());
        card.setEmail(request.getEmail());
        card.setTheGroup(request.getTheGroup());
        return cardOperatingService.updateCard(card, request.getPhNumbers());
    }

    // Fuzzy search by name
    @GetMapping("/search")
    public List<BasicCard> search(@RequestParam String keyword) {
        return cardOperatingService.searchByName(keyword);
    }

    // Combined search by phone numbers
    @GetMapping("/search/phones")
    public List<BasicCard> searchByPhones(@RequestParam List<String> numbers) {
        return cardOperatingService.searchByPhones(numbers);
    }

    // Sort
    @GetMapping("/sort")
    public List<BasicCard> sort(@RequestParam String field, @RequestParam boolean asc) {
        return cardOperatingService.sortCards(field, asc);
    }

    // Fetch the full card
    @GetMapping("/{name}")
    public BasicCard getCard(@PathVariable String name) {
        return cardOperatingService.getFullCard(name);
    }
}

spring.application.name=task6
spring.datasource.type=com.alibaba.druid.pool.DruidDataSource
spring.datasource.driver-class-name=com.mysql.jdbc.Driver
spring.datasource.url=jdbc:mysql://localhost:3306/task6?useSSL=false
spring.datasource.username=root
spring.datasource.password=cjzl22446
mybatis-plus.configuration.log-impl=org.apache.ibatis.logging.stdout.StdOutImpl

</licenses>
<developers>
    <developer/>
</developers>
<scm>
    <connection/>
    <developerConnection/>
    <tag/>
    <url/>
</scm>
<properties>
    <java.version>21</java.version>
</properties>
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-thymeleaf</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.projectlombok</groupId>
        <artifactId>lombok</artifactId>
        <optional>true</optional>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>com.baomidou</groupId>
        <artifactId>mybatis-plus-spring-boot3-starter</artifactId>
        <version>3.5.7</version>
    </dependency>
    <dependency>
        <groupId>mysql</groupId>
        <artifactId>mysql-connector-java</artifactId>
        <version>8.0.33</version>
    </dependency>
    <dependency>
        <groupId>com.alibaba</groupId>
        <artifactId>druid-spring-boot-starter</artifactId>
        <version>1.2.22</version>
    </dependency>
</dependencies>
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <configuration>
                <annotationProcessorPaths>
                    <path>
                        <groupId>org.projectlombok</groupId>
                        <artifactId>lombok</artifactId>
                    </path>
                </annotationProcessorPaths>
            </configuration>
        </plugin>
        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
            <configuration>
                <excludes>
                    <exclude>
                        <groupId>org.projectlombok</groupId>
                        <artifactId>lombok</artifactId>
                    </exclude>
                </excludes>
            </configuration>
        </plugin>
    </plugins>
</build>
</project>

This is all the code. Find the cause of the jar startup error:

11:42:27.854 [main] ERROR org.springframework.boot.SpringApplication -- Application run failed
org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'cardController' defined in URL [jar:file:/C:/Users/17586/IdeaProjects/task6/out/artifacts/task6_jar/task6.jar!/com/example/task6/Controller/CardController.class]: Unsatisfied dependency expressed through constructor parameter 0: Error creating bean with name 'cardOperatingService' defined in URL [jar:file:/C:/Users/17586/IdeaProjects/task6/out/artifacts/task6_jar/task6.jar!/com/example/task6/Service/CardOperatingService.class]: Unsatisfied dependency expressed through constructor parameter 0: Error creating bean with name 'basicCardMapper' defined in URL [jar:file:/C:/Users/17586/IdeaProjects/task6/out/artifacts/task6_jar/task6.jar!/com/example/task6/Mapper/BasicCardMapper.class]: Property 'sqlSessionFactory' or 'sqlSessionTemplate' are required
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:804)
    at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:240)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1395)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1232)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:569)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:529)
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:339)
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:373)
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:337)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202)
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.instantiateSingleton(DefaultListableBeanFactory.java:1222)
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingleton(DefaultListableBeanFactory.java:1188)
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:1123)
    at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:987)
    at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:627)
    at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:753)
    at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:439)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:318)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1362)
    at org.springframework.boot.SpringApplication.run(SpringApplication.java:1351)
    at com.example.task6.Task6Application.main(Task6Application.java:11)
Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'cardOperatingService' defined in URL [jar:file:/C:/Users/17586/IdeaProjects/task6/out/artifacts/task6_jar/task6.jar!/com/example/task6/Service/CardOperatingService.class]: Unsatisfied dependency expressed through constructor parameter 0: Error creating bean with name 'basicCardMapper' defined in URL [jar:file:/C:/Users/17586/IdeaProjects/task6/out/artifacts/task6_jar/task6.jar!/com/example/task6/Mapper/BasicCardMapper.class]: Property 'sqlSessionFactory' or 'sqlSessionTemplate' are required
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:804)
    at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:240)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1395)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1232)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:569)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:529)
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:339)
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:373)
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:337)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202)
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1682)
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1628)
    at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:913)
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:791)
    ... 20 common frames omitted
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'basicCardMapper' defined in URL [jar:file:/C:/Users/17586/IdeaProjects/task6/out/artifacts/task6_jar/task6.jar!/com/example/task6/Mapper/BasicCardMapper.class]: Property 'sqlSessionFactory' or 'sqlSessionTemplate' are required
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1826)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:607)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:529)
    at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:339)
    at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:373)
    at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:337)
    at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:202)
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1682)
    at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1628)
    at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:913)
    at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:791)
    ... 33 common frames omitted
Caused by: java.lang.IllegalArgumentException: Property 'sqlSessionFactory' or 'sqlSessionTemplate' are required
    at org.springframework.util.Assert.notNull(Assert.java:181)
    at org.mybatis.spring.support.SqlSessionDaoSupport.checkDaoConfig(SqlSessionDaoSupport.java:125)
    at org.mybatis.spring.mapper.MapperFactoryBean.checkDaoConfig(MapperFactoryBean.java:73)
    at org.springframework.dao.support.DaoSupport.afterPropertiesSet(DaoSupport.java:44)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1873)
    at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1822)
    ... 43 common frames omitted
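The failing beans are loaded from jar:file:/C:/Users/17586/IdeaProjects/task6/out/artifacts/task6_jar/task6.jar, i.e. an IDE-exported artifact rather than the jar repackaged by the spring-boot-maven-plugin declared in the pom. One plausible cause (an assumption based on that path, not something the log proves): a plain jar merges or drops the dependencies' META-INF auto-configuration metadata, so the MyBatis-Plus starter never creates a SqlSessionFactory bean, and every mapper fails with exactly this "Property 'sqlSessionFactory' or 'sqlSessionTemplate' are required" message. Building with mvn package and running the jar from target/ would rule this out. A Boot-repackaged jar is easy to recognize from its manifest; the sketch below (a hypothetical helper, not project code) checks for that:

```java
import java.io.File;
import java.io.IOException;
import java.util.jar.JarFile;
import java.util.jar.Manifest;

// Sketch (hypothetical helper): a jar repackaged by spring-boot-maven-plugin
// sets Main-Class to a Spring Boot launcher under org.springframework.boot.loader
// (plus a Start-Class attribute); a plain IDE-built jar names the app class directly.
public class JarKindCheck {
    public static String mainClassOf(File jar) throws IOException {
        try (JarFile jf = new JarFile(jar)) {
            Manifest m = jf.getManifest();
            return m == null ? null : m.getMainAttributes().getValue("Main-Class");
        }
    }

    public static boolean looksRepackaged(File jar) throws IOException {
        String mc = mainClassOf(jar);
        return mc != null && mc.startsWith("org.springframework.boot.loader.");
    }

    public static void main(String[] args) throws IOException {
        // Point it at the jar being launched, e.g. the one from out/artifacts.
        File jar = new File(args.length > 0 ? args[0] : "task6.jar");
        System.out.println(jar + ": Main-Class=" + mainClassOf(jar)
                + ", repackaged=" + looksRepackaged(jar));
    }
}
```

If the check reports the application class instead of a Boot launcher, the jar was not repackaged, and rebuilding via the Maven spring-boot:repackage lifecycle is the first thing to try.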