Medium, Problem 122: 220. Contains Duplicate III

This article presents a TreeSet-based solution to the "contains almost duplicate" problem: maintain a TreeSet holding the most recent k elements and use its ordering to decide whether there exist two values whose difference is at most t while their indices differ by at most k.

Problem statement:
Given an array of integers, find out whether there are two distinct indices i and j in the array such that the absolute difference between nums[i] and nums[j] is at most t and the absolute difference between i and j is at most k.
Summary:
Given an array nums, determine whether there exist two indices i and j such that |nums[i] - nums[j]| ≤ t and |i - j| ≤ k.
Analysis:
TreeSet fits this problem well. It is a set (no duplicate elements) that does not preserve insertion order, but it keeps its elements sorted and provides floor(n), the greatest element less than or equal to n, and ceiling(n), the smallest element greater than or equal to n; together these are the elements adjacent to n in sorted order.
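As a quick illustration of these two methods, here is a minimal standalone sketch (the class name is made up for the demo and is not part of the solution):

import java.util.TreeSet;

public class FloorCeilingDemo {
  public static void main(String[] args) {
    TreeSet<Integer> set = new TreeSet<>();
    set.add(2); set.add(7); set.add(11);
    System.out.println(set.floor(10));   // 7   (greatest element <= 10)
    System.out.println(set.ceiling(10)); // 11  (smallest element >= 10)
    System.out.println(set.floor(1));    // null (no element <= 1)
  }
}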
Build a TreeSet and keep its size at most k (once the window would exceed k elements, evict nums[i - k]). Let n = nums[i]. At each step check whether n - floor(n) ≤ t or ceiling(n) - n ≤ t; because the set never holds more than k elements, whenever floor(n) or ceiling(n) exists, its index differs from i by at most k. If either check succeeds, a valid pair exists and we return true; otherwise insert n and evict nums[i - k] when the window is full. The overall time complexity is O(n log k), since TreeSet's add and remove each cost O(log k).
Source code (Java):

import java.util.TreeSet;

public class Solution {
  public boolean containsNearbyAlmostDuplicate(int[] nums, int k, int t) {
    if (k < 1 || t < 0)
      return false;
    // Sliding window of at most k elements, kept sorted by the TreeSet.
    // Long values are used so that t + floor(n) cannot overflow int.
    TreeSet<Long> set = new TreeSet<>();
    for (int i = 0; i < nums.length; i++) {
      long n = nums[i];
      Long floor = set.floor(n);     // greatest element <= n, or null
      Long ceiling = set.ceiling(n); // smallest element >= n, or null
      if ((floor != null && n <= t + floor) ||
          (ceiling != null && ceiling <= t + n))
        return true;
      set.add(n);
      if (i >= k)                    // evict the element that left the window
        set.remove((long) nums[i - k]);
    }
    return false;
  }
}
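
A quick sanity check (a sketch: SolutionDemo is a made-up name, and the inputs are the sample cases usually quoted for this problem):

public class SolutionDemo {
  public static void main(String[] args) {
    Solution s = new Solution();
    // Expected: true  (indices 0 and 3 hold equal values within index distance 3)
    System.out.println(s.containsNearbyAlmostDuplicate(new int[]{1, 2, 3, 1}, 3, 0));
    // Expected: true  (|1 - 0| <= 2 within index distance 1)
    System.out.println(s.containsNearbyAlmostDuplicate(new int[]{1, 0, 1, 1}, 1, 2));
    // Expected: false (no qualifying pair)
    System.out.println(s.containsNearbyAlmostDuplicate(new int[]{1, 5, 9, 1, 5, 9}, 2, 3));
  }
}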

Results:
68 ms, beats 10.72%; the most common runtime is 23 ms, at 5.88%.
Cmershen's notes:
This problem is essentially a sliding window and is the first one in this series that calls for a TreeSet, but the resulting solution is not fast; a linear O(n) algorithm probably exists (one bucket-based idea is sketched below). TreeSet is backed by a red-black tree: it loses insertion order but keeps its elements sorted. The red-black tree is a widely used balanced binary search tree for lookups; its details are easy to look up. TreeMap, which keeps its keys sorted, is a closely related structure.
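
A minimal sketch of that bucket-based O(n) idea (not the author's code, only an assumption of what the linear algorithm could look like): map each value to a bucket of width t + 1, so two values in the same bucket differ by at most t, and values in neighboring buckets need one explicit comparison.

import java.util.HashMap;
import java.util.Map;

public class BucketSolution {
  public boolean containsNearbyAlmostDuplicate(int[] nums, int k, int t) {
    if (k < 1 || t < 0)
      return false;
    long width = (long) t + 1;                  // bucket width, long to avoid overflow
    Map<Long, Long> buckets = new HashMap<>();  // bucket id -> the value stored in it
    for (int i = 0; i < nums.length; i++) {
      long n = nums[i];
      long id = Math.floorDiv(n, width);        // floorDiv keeps negatives in the right bucket
      if (buckets.containsKey(id))              // same bucket: difference is at most t
        return true;
      if (buckets.containsKey(id - 1) && n - buckets.get(id - 1) <= t)
        return true;
      if (buckets.containsKey(id + 1) && buckets.get(id + 1) - n <= t)
        return true;
      buckets.put(id, n);
      if (i >= k)                               // drop the element that left the window
        buckets.remove(Math.floorDiv((long) nums[i - k], width));
    }
    return false;
  }
}

Each element is inserted into and removed from the hash map at most once, so the whole pass is O(n) rather than O(n log k).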
