Fixing "Undefined index: HTTP_RAW_POST_DATA" in PHP

This article describes an improved way to obtain raw POST data in PHP. By using the conditional (ternary) operator together with the file_get_contents() function, the HTTP_RAW_POST_DATA variable is handled more flexibly, and the request input stream is still read correctly under server configurations that do not populate that variable.



$postStr = $GLOBALS["HTTP_RAW_POST_DATA"];

replace it with


$postStr = isset($GLOBALS['HTTP_RAW_POST_DATA']) ? $GLOBALS['HTTP_RAW_POST_DATA'] : file_get_contents("php://input");

and the notice goes away. The ternary expression uses $GLOBALS['HTTP_RAW_POST_DATA'] when it is set and falls back to reading the php://input stream otherwise. This is necessary because $HTTP_RAW_POST_DATA is only populated when always_populate_raw_post_data is enabled; the feature was deprecated in PHP 5.6 and removed in PHP 7.0, which is why the "Undefined index" notice appears on many servers.
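To keep call sites uniform, the fallback can be wrapped in a small helper. The sketch below is a minimal, illustrative example rather than part of the original article; the function name getRawPostData() and the JSON-decoding usage are assumptions.

<?php
/**
 * Return the raw HTTP request body.
 *
 * Uses $HTTP_RAW_POST_DATA when the server still populates it
 * (older PHP with always_populate_raw_post_data enabled) and
 * falls back to the php://input stream otherwise.
 */
function getRawPostData()
{
    if (isset($GLOBALS['HTTP_RAW_POST_DATA']) && $GLOBALS['HTTP_RAW_POST_DATA'] !== '') {
        return $GLOBALS['HTTP_RAW_POST_DATA'];
    }
    // php://input is available on all supported PHP versions and,
    // since PHP 5.6, can be read more than once.
    return file_get_contents('php://input');
}

// Illustrative usage: decode a JSON body; non-JSON payloads
// (for example XML callbacks) can be parsed from $postStr directly.
$postStr = getRawPostData();
$payload = json_decode($postStr, true);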
class LazyLoader { constructor(dataManager) { this.dataManager = dataManager; this.cache = new Map(); this.entityTypeMap = { bancai: 'bancais', dingdan: 'dingdans', mupi: 'mupis', chanpin: 'chanpins', kucun: 'kucuns', dingdan_bancai: 'dingdan_bancais', chanpin_zujian: 'chanpin_zujians', zujian: 'zujians', caizhi: 'caizhis', dingdan_chanpin: 'dingdan_chanpins', user: 'users', jinhuo: 'jinhuos' }; } createProxy(entity, entityType) { if (entity.__isProxy) return entity; const handler = { get: (target, prop) => { if (typeof target[prop] !== 'object' || target[prop] === null) { return target[prop]; } console.log(entity) const refType = this.getReferenceType(prop); if (!refType) return target[prop]; if (Array.isArray(target[prop])) { return this.loadReferences(target[prop], refType); } return this.loadReference(target[prop], refType); } }; entity.__isProxy = true; return new Proxy(entity, handler); } getReferenceType(prop) { const baseProp = prop.replace(/\d/g, ''); if (this.entityTypeMap[baseProp]) return this.entityTypeMap[baseProp]; const pluralProp = `${baseProp}s`; if (this.dataManager._rawData[pluralProp]) return pluralProp; return null; } loadReference(ref, refType) { console.log(entity) if (!ref?.id) return ref; const cacheKey = `${refType}_${ref.id}`; if (this.cache.has(cacheKey)) return this.cache.get(cacheKey); const entities = this.dataManager._rawData[refType] || []; const entity = entities.find(e => e.id === ref.id); console.log(entity) if (!entity) return ref; const resolvedEntity = this.resolveReferences({...entity}); const proxy = this.createProxy(resolvedEntity, refType); this.cache.set(cacheKey, proxy); console.log(entity) return proxy; } loadReferences(refs, refType) { return refs.map(ref => this.loadReference(ref, refType)); } resolveReferences(entity) { for (const attr in entity) { const refType = this.getReferenceType(attr); if (!refType) continue; if (Array.isArray(entity[attr])) { entity[attr] = entity[attr].map(item => this.dataManager._rawData[refType]?.find(e => e.id === item.id) || item ); } else if (entity[attr]?.id) { entity[attr] = this.dataManager._rawData[refType]?.find(e => e.id === entity[attr].id) || entity[attr]; } } return entity; } clearCache() { this.cache.clear(); } } class MiniProgramDataManager { constructor(baseUrl = '') { this.baseUrl = baseUrl; this.debug = true; this.networkAvailable = false; this.isSyncing = false; this.lastSync = null; this.syncInterval = 5 * 60 * 1000; this.storageKey = 'miniProgramData'; this._rawData = this.createEmptyData(); this.lazyLoader = new LazyLoader(this); this.callbacks = { all: [], bancais: [], dingdans: [], mupis: [], chanpins: [], kucuns: [], chanpin_zujians: [], dingdan_bancais: [], zujians: [], caizhis: [], dingdan_chanpins: [], users: [], jinhuos: [] }; this.initNetwork(); this.loadDataFromStorage(); this.startAutoSync(); } createEmptyData() { return { bancais: [], dingdans: [], mupis: [], chanpins: [], kucuns: [], dingdan_bancais: [], chanpin_zujians: [], zujians: [], caizhis: [], dingdan_chanpins: [], users: [], jinhuos: [], _lastModified: null, _lastSync: null }; } get data() { const handler = { get: (target, prop) => { if (prop.startsWith('_')) return target[prop]; if (Array.isArray(target[prop])) { return target[prop].map(item => this.lazyLoader.createProxy(item, prop.replace(/s$/, '')) ); } return target[prop]; }, set: (target, prop, value) => { target[prop] = value; return true; } }; return new Proxy(this._rawData, handler); } async initialize() { try { await this.syncData(); return true; } catch (error) 
{ if (this._rawData._lastSync) return true; throw error; } } startAutoSync() { this.autoSyncTimer = setInterval(() => { !this.isSyncing && this.syncData(); }, this.syncInterval); } stopAutoSync() { clearInterval(this.autoSyncTimer); } async initNetwork() { try { const { networkType } = await wx.getNetworkType(); this.networkAvailable = networkType !== 'none'; } catch { this.networkAvailable = false; } } async syncData() { if (this.isSyncing) return; this.isSyncing = true; try { const since = this._rawData._lastSync; await this.fetchAll(since); this.lazyLoader.clearCache(); this.saveDataToStorage(); this.triggerCallbacks('refresh', 'all', this.data); } catch (error) { console.error('Sync failed:', error); this.triggerCallbacks('sync_error', 'all', { error }); if (!this._rawData._lastSync) throw error; } finally { this.isSyncing = false; } } async fetchAll(since) { try { const params = since ? { since } : {}; const resolvedData = this.baseUrl ? await this.request('/app/all', 'GET', params) : this.createEmptyData(); Object.keys(this._rawData).forEach(key => { if (key.startsWith('_') || !resolvedData[key]) return; if (since) { resolvedData[key].forEach(newItem => { const index = this._rawData[key].findIndex(item => item.id === newItem.id); index >= 0 ? this._rawData[key][index] = newItem : this._rawData[key].push(newItem); }); } else { this._rawData[key] = resolvedData[key]; } }); this._rawData._lastSync = new Date().toISOString(); this.saveDataToStorage(); console.log( this._rawData) return true; } catch (error) { console.error('Fetch error:', error); this.triggerCallbacks('fetch_error', 'all', { error }); throw error; } } async request(url, method, data, retryCount = 3) { return new Promise((resolve, reject) => { const fullUrl = `${this.baseUrl}${url}`; const requestTask = () => { wx.request({ url: fullUrl, method, data, header: { 'Content-Type': 'application/json' }, success: (res) => { if (res.statusCode >= 200 && res.statusCode < 300) { resolve(res.data); } else { const err = new Error(res.data?.message || 'API error'); retryCount > 1 ? setTimeout(requestTask, 1000, retryCount - 1) : reject(err); } }, fail: (err) => { retryCount > 1 ? setTimeout(requestTask, 1000, retryCount - 1) : reject(new Error(`Network error: ${err.errMsg}`)); } }); }; requestTask(); }); } registerCallback(entity, callback) { this.callbacks[entity]?.push(callback) || this.callbacks.all.push(callback); } unregisterCallback(entity, callback) { const arr = this.callbacks[entity] || this.callbacks.all; const index = arr.indexOf(callback); if (index !== -1) arr.splice(index, 1); } triggerCallbacks(operation, entity, data) { this.callbacks.all.forEach(cb => cb(operation, entity, data)); this.callbacks[entity]?.forEach(cb => cb(operation, data)); } async crudOperation(operation, entity, data) { try { const result = await this.request(`/app/${operation}/${entity}`, 'POST', data); this.updateLocalData(operation, entity, result || data); this.triggerCallbacks(operation, entity, result || data); return result; } catch (error) { this.triggerCallbacks(`${operation}_error`, entity, { data, error }); throw error; } } updateLocalData(operation, entity, data) { const key = `${entity}s`; const collection = this._rawData[key] || []; switch (operation) { case 'add': collection.push(data); break; case 'update': const index = collection.findIndex(item => item.id === data.id); index >= 0 ? 
collection[index] = data : collection.push(data); break; case 'delete': const deleteIndex = collection.findIndex(item => item.id === data.id); if (deleteIndex >= 0) collection.splice(deleteIndex, 1); break; } this._rawData._lastModified = new Date().toISOString(); this.lazyLoader.clearCache(); this.saveDataToStorage(); } loadDataFromStorage() { try { const storedData = wx.getStorageSync(this.storageKey); if (storedData) this._rawData = storedData; } catch (error) { console.error('Storage load error:', error); } } saveDataToStorage() { try { wx.setStorageSync(this.storageKey, this._rawData); } catch (error) { console.error('Storage save error:', error); wx.showToast({ title: '数据保存失败', icon: 'none' }); } } async addEntity(entity, data) { return this.crudOperation('add', entity, data); } async updateEntity(entity, data) { return this.crudOperation('update', entity, data); } async deleteEntity(entity, id) { return this.crudOperation('delete', entity, { id }); } async transactionalOperation(endpoint, data) { try { await this.request(`/app/Transactional/${endpoint}`, 'POST', data); await this.syncData(); return true; } catch (error) { this.triggerCallbacks('transaction_error', endpoint, { data, error }); throw error; } } } module.exports = MiniProgramDataManager; 获取网络数据不到数据,但服务端数据已经发过来了
07-26
// 解析数据引用关系的辅助函数 /** * 解析数据中的引用关系 * 该函数用于处理嵌套的数据结构,将数据中的引用关系解析为实际的对象引用。 * 它会遍历数据中的所有实体,查找属性名中包含的数字(如"item1"), * 并尝试将这些属性值替换为对应类型数据中的实际对象引用。 * @param {Object} data - 包含嵌套引用的原始数据对象 * @returns {Object} 处理后的数据对象,其中引用已被解析为实际对象 */ function resolveDataReferences(data) { const keys = Object.keys(data); for (const key of keys) { const entities = data[key]; for (const entity of entities) { for (const attribute in entity) { if (entity.hasOwnProperty(attribute)) { // 修复:统一使用复数形式查找引用类型 let refType = attribute.replace(/\d/g, ''); // 尝试直接查找复数形式 if (!data[refType] && data[`${refType}s`]) { refType = `${refType}s`; } if (Array.isArray(entity[attribute])) { entity[attribute] = entity[attribute].map(item => data[refType]?.find(updateItem => updateItem.id === item.id) || item ); } else if (typeof entity[attribute] === "object" && entity[attribute] !== null) { entity[attribute] = data[refType]?.find(updateItem => updateItem.id === entity[attribute].id) || entity[attribute]; } } } } } return data; } // 解析单个实体的数据引用 /** * 解析数据引用关系 * 该函数用于处理实体对象与数据源之间的引用关系,自动匹配并更新实体中的引用字段。 * @param {Object} entity - 需要处理的实体对象 * @param {Object} data - 包含引用数据的数据源对象 * @returns {Object} 处理后的实体对象 * 功能说明: * 遍历实体对象的每个属性 * 如果属性值是数组,则尝试在数据源中查找匹配项更新数组元素 * 如果属性值是对象,则尝试在数据源中查找匹配项更新该对象 * 自动处理单复数形式的数据源键名 */ function resolveDataReference(entity, data) { for (const attribute in entity) { if (entity.hasOwnProperty(attribute)) { // 修复:统一使用复数形式查找引用类型 let refType = attribute.replace(/\d/g, ''); // 尝试直接查找复数形式 if (!data[refType] && data[`${refType}s`]) { refType = `${refType}s`; } if (Array.isArray(entity[attribute])) { entity[attribute] = entity[attribute].map(item => data[refType]?.find(updateItem => updateItem.id === item.id) || item ); } else if (typeof entity[attribute] === "object" && entity[attribute] !== null) { entity[attribute] = data[refType]?.find(updateItem => updateItem.id === entity[attribute].id) || entity[attribute]; } } } return entity; } /** * 懒加载处理器 * 负责管理实体类之间的关联依赖,实现按需加载 */ class LazyLoader { constructor(dataManager) { this.dataManager = dataManager; this.cache = new Map(); this.entityTypeMap = { bancai: 'bancais', dingdan: 'dingdans', mupi: 'mupis', chanpin: 'chanpins', kucun: 'kucuns', dingdan_bancai: 'dingdan_bancais', chanpin_zujian: 'chanpin_zujians', zujian: 'zujians', caizhi: 'caizhis', dingdan_chanpin: 'dingdan_chanpins', user: 'users', jinhuo: 'jinhuos' }; } /** * 创建实体代理 * @param {Object} entity 实体对象 * @param {string} entityType 实体类型 * @returns {Proxy} 返回代理后的实体 */ createProxy(entity, entityType) { const handler = { get: (target, prop) => { // 如果是普通属性,直接返回 if (typeof target[prop] !== 'object' || target[prop] === null) { return target[prop]; } // 检查是否是引用属性 const refType = this.getReferenceType(prop); if (refType) { // 如果是数组引用 if (Array.isArray(target[prop])) { return this.loadReferences(target[prop], refType); } // 如果是对象引用 else { return this.loadReference(target[prop], refType); } } return target[prop]; } }; if(!entity.__isProxy)return new Proxy(entity, handler); return entity; } /** * 获取引用类型 * @param {string} prop 属性名 * @returns {string|null} 引用类型(复数形式) */ getReferenceType(prop) { // 去除数字后缀(如 "mupi1" -> "mupi") const baseProp = prop.replace(/\d/g, ''); // 尝试直接匹配实体类型 if (this.entityTypeMap[baseProp]) { return this.entityTypeMap[baseProp]; } // 尝试添加 's' 后缀匹配 const pluralProp = `${baseProp}s`; if (this.dataManager._rawData[pluralProp]) { return pluralProp; } return null; } /** * 加载单个关联引用 * @param {Object} ref 引用对象(包含id) * @param {string} refType 引用类型(复数形式) * @returns {Promise<Object>} 解析后的实体对象 */ loadReference(ref, refType) { 
if (!ref || !ref.id) { return null;} const cacheKey = `${refType}_${ref.id}`; // 检查缓存 if (this.cache.has(cacheKey)) { return this.cache.get(cacheKey); } // 尝试从本地数据查找 const entities = this.dataManager._rawData[refType] || []; let entity = entities.find(e => e.id === ref.id); // 如果本地找不到,尝试同步数据 if (!entity) { //this.dataManager.syncData(); entity = (this.dataManager._rawData[refType] || []).find(e => e.id === ref.id); } if (entity) { // 递归解析嵌套引用 const resolvedEntity = resolveDataReference({...entity}, this.dataManager._rawData); // 创建代理并缓存 const proxy = this.createProxy(resolvedEntity, refType); this.cache.set(cacheKey, proxy); return proxy; } return ref; // 返回原始引用 } /** * 加载多个关联引用 * @param {Array} refs 引用对象数组 * @param {string} refType 引用类型(复数形式) * @returns {Promise<Array>} 解析后的实体对象数组 */ loadReferences(refs, refType) { if (!Array.isArray(refs)) return []; return Promise.all( refs.map(ref => this.loadReference(ref, refType)) ); } /** * 清除缓存 */ clearCache() { this.cache.clear(); } } class MiniProgramDataManager { constructor(baseUrl) { this.baseUrl = baseUrl; this.debug = true; // 调试模式开关 this.requestCount = 0; // 请求计数器 // 数据结构定义 this._rawData = { bancais: [], dingdans: [], mupis: [], chanpins: [], kucuns: [], dingdan_bancais: [], chanpin_zujians: [], zujians: [], caizhis: [], dingdan_chanpins: [], users: [], jinhuos: [], _lastModified: null, _lastSync: null }; // 初始化网络状态 this.networkAvailable = false; this.checkNetwork().then(type => { this.networkAvailable = type !== 'none'; }); this.lazyLoader = new LazyLoader(this); this.loadDataFromStorage(); this.isSyncing = false; this.lastSync = null; this.callbacks = { all: [], bancais: [], dingdan: [], mupi: [], chanpin: [], kucun: [], chanpin_zujian: [], dingdan_bancai: [], zujian: [], caizhi: [], dingdan_chanpin: [], user: [], jinhuo: [] }; this.syncQueue = Promise.resolve(); this.entiyeText = { bancai: '板材已存在', dingdan: '订单已存在', mupi: '木皮已存在', chanpin: '产品已存在', kucun: '已有库存记录', chanpin_zujian: '产品已有该组件', dingdan_bancai: '', zujian: '组件已定义过了', caizhi: '材质已定义过了', dingdan_chanpin: '订单下已有该产品', user: '' }; this.syncInterval = 5 * 60 * 1000; // 5分钟 this.storageKey = 'miniProgramData'; // 本地存储的键名 } get data() { // 创建数据代理 const handler = { get: (target, prop) => { // 处理特殊属性 if (prop.startsWith('_')) { return target[prop]; } // 处理数组类型的实体集合 if (Array.isArray(target[prop])) { return target[prop].map(item => this.lazyLoader.createProxy(item, prop.replace(/s$/, '')) ); } if (typeof target[prop] == 'object' && target[prop] === null) { return this.lazyLoader.createProxy(item, prop) } // 默认返回原始值 return target[prop]; }, // 保持其他操作不变 set: (target, prop, value) => { target[prop] = value; return true; } }; return new Proxy(this._rawData, handler); } // 添加显式初始化方法 async initialize() { console.log('初始化数据管理器...'); // 先尝试从本地存储加载数据 this.loadDataFromStorage(); // 检查是否有本地数据 const hasLocalData = this._rawData._lastSync !== null; console.log('本地存储数据状态:', hasLocalData ? 
'有数据' : '无数据'); // 启动自动同步 this.startAutoSync(); // 执行首次数据同步 try { await this.syncData(); console.log('数据同步完成'); // 打印数据统计 const stats = { bancais: this._rawData.bancais.length, kucuns: this._rawData.kucuns.length, dingdans: this._rawData.dingdans.length, chanpins: this._rawData.chanpins.length }; console.log('数据统计:', stats); return true; } catch (error) { console.error('数据同步失败:', error); // 如果同步失败但有本地数据,仍然返回成功 if (hasLocalData) { console.log('使用本地缓存数据'); return true; } throw error; } } /** * 启动自动同步定时器 * 每隔syncInterval毫秒检查并执行数据同步 * 如果已有同步任务进行中则跳过 */ startAutoSync() { if (this.autoSyncTimer) clearInterval(this.autoSyncTimer); this.autoSyncTimer = setInterval(() => { if (!this.isSyncing) this.syncData(); }, this.syncInterval); } /** * 停止自动同步 */ stopAutoSync() { clearInterval(this.autoSyncTimer); } /** * 检查网络状态 */ checkNetwork() { return new Promise((resolve) => { wx.getNetworkType({ success: (res) => { resolve(res.networkType); }, fail: () => { resolve('unknown'); } }); }); } /** * 获取所有数据(全量或增量) * @async * @param {string} [since] - 增量获取的时间戳,不传则全量获取 * @returns {Promise} 是否获取成功 * @description * 根据since参数决定全量或增量获取数据 * 增量获取时会合并新数据到现有数据 * 全量获取会直接替换现有数据 * 成功后会更新同步时间并保存到本地存储 * 失败时会触发错误回调,若无历史数据则抛出错误 */ async fetchAll(since) { try { console.log(since ? `增量获取数据(自${since})...` : '全量获取数据...'); let resolvedData; // 如果baseUrl为空,尝试加载本地测试数据 if (!this.baseUrl) { console.log('使用本地测试数据'); try { // 尝试从本地存储加载数据 this.loadDataFromStorage(); // 如果本地存储没有数据,尝试加载测试数据 if (!this._rawData._lastSync) { console.log('本地存储无数据,尝试加载测试数据'); // 创建测试数据 const testData = this.createTestData(); resolvedData = testData; console.log('已创建测试数据'); } else { console.log('使用本地存储的数据'); return true; // 已经从本地存储加载了数据 } } catch (e) { console.error('加载本地测试数据失败:', e); throw new Error('无法加载测试数据'); } } else { // 正常从API获取数据 const params = since ? 
{ since } : {}; resolvedData = await this.request('/app/all', 'GET', params); } // 更新networkData if (resolvedData) { Object.keys(this._rawData).forEach(key => { if (key.startsWith('_')) return; if (resolvedData[key]) { if (since) { // 增量更新: 合并新数据到现有数据 resolvedData[key].forEach(newItem => { const index = this._rawData[key].findIndex(item => item.id === newItem.id); if (index >= 0) { this._rawData[key][index] = newItem; } else { this._rawData[key].push(newItem); } }); } else { // 全量更新: 直接替换 this._rawData[key] = resolvedData[key]; } } }); } // 更新同步时间 this.lastSync = new Date(); this._rawData._lastSync = this.lastSync.toISOString(); // 保存到本地存储 this.saveDataToStorage(); this.triggerCallbacks('refresh', 'all', this.data); return true; } catch (error) { console.error('Fetch error:', error); this.triggerCallbacks('fetch_error', 'all', { error }); // 失败时尝试使用本地数据 if (!this.lastSync) { throw new Error('初始化数据获取失败'); } return false; } } // 创建测试数据 createTestData() { console.log('创建测试数据'); // 创建材质数据 const caizhis = [ { id: 1, name: '实木' }, { id: 2, name: '密度板' }, { id: 3, name: '多层板' } ]; // 创建木皮数据 const mupis = [ { id: 1, name: '橡木', you: false }, { id: 2, name: '胡桃木', you: true }, { id: 3, name: '枫木', you: false } ]; // 创建板材数据 const bancais = [ { id: 1, houdu: 18, caizhi: { id: 1 }, mupi1: { id: 1 }, mupi2: null }, { id: 2, houdu: 25, caizhi: { id: 2 }, mupi1: { id: 2 }, mupi2: { id: 3 } }, { id: 3, houdu: 12, caizhi: { id: 3 }, mupi1: { id: 3 }, mupi2: null } ]; // 创建库存数据 const kucuns = [ { id: 1, bancai: { id: 1 }, shuliang: 100 }, { id: 2, bancai: { id: 2 }, shuliang: 50 }, { id: 3, bancai: { id: 3 }, shuliang: 75 } ]; // 创建组件数据 const zujians = [ { id: 1, name: '桌面' }, { id: 2, name: '桌腿' }, { id: 3, name: '抽屉' } ]; // 创建产品数据 const chanpins = [ { id: 1, name: '办公桌', bianhao: 'CP001' }, { id: 2, name: '餐桌', bianhao: 'CP002' }, { id: 3, name: '书桌', bianhao: 'CP003' } ]; // 创建产品组件关联 const chanpin_zujians = [ { id: 1, chanpin: { id: 1 }, zujian: { id: 1 }, bancai: { id: 1 } }, { id: 2, chanpin: { id: 1 }, zujian: { id: 2 }, bancai: { id: 2 } }, { id: 3, chanpin: { id: 2 }, zujian: { id: 1 }, bancai: { id: 3 } } ]; // 创建订单数据 const dingdans = [ { id: 1, number: 'DD001', xiadan: '2023-01-01T00:00:00.000Z', jiaohuo: '2023-01-15T00:00:00.000Z' }, { id: 2, number: 'DD002', xiadan: '2023-02-01T00:00:00.000Z', jiaohuo: '2023-02-15T00:00:00.000Z' } ]; // 创建订单产品关联 const dingdan_chanpins = [ { id: 1, dingdan: { id: 1 }, chanpin: { id: 1 }, shuliang: 2 }, { id: 2, dingdan: { id: 1 }, chanpin: { id: 2 }, shuliang: 1 }, { id: 3, dingdan: { id: 2 }, chanpin: { id: 3 }, shuliang: 3 } ]; // 创建订单板材关联 const dingdan_bancais = [ { id: 1, dingdan: { id: 1 }, chanpin: { id: 1 }, zujian: { id: 1 }, bancai: { id: 1 }, shuliang: 2 }, { id: 2, dingdan: { id: 1 }, chanpin: { id: 1 }, zujian: { id: 2 }, bancai: { id: 2 }, shuliang: 8 }, { id: 3, dingdan: { id: 1 }, chanpin: { id: 2 }, zujian: { id: 1 }, bancai: { id: 3 }, shuliang: 1 } ]; // 创建用户数据 const users = [ { id: 1, name: 'admin', password: 'admin123' } ]; // 创建进货记录 const jinhuos = [ { id: 1, kucun: { id: 1 }, shuliang: 100, date: '2023-01-01T00:00:00.000Z', text: '初始库存', theTypeOfOperation: 1, user: { id: 1 } }, { id: 2, kucun: { id: 2 }, shuliang: 50, date: '2023-01-01T00:00:00.000Z', text: '初始库存', theTypeOfOperation: 1, user: { id: 1 } }, { id: 3, kucun: { id: 3 }, shuliang: 75, date: '2023-01-01T00:00:00.000Z', text: '初始库存', theTypeOfOperation: 1, user: { id: 1 } } ]; return { bancais, dingdans, mupis, chanpins, kucuns, dingdan_bancais, chanpin_zujians, zujians, caizhis, 
dingdan_chanpins, users, jinhuos }; } /** * 微信小程序API请求封装 */ request(url, method = 'GET', data = null, retryCount = 3) { return new Promise((resolve, reject) => { const makeRequest = (attempt) => { const fullUrl = `${this.baseUrl}${url}`; if (this.debug) { console.log(`[请求] ${method} ${fullUrl}`, { attempt, data, timestamp: new Date().toISOString() }); } wx.request({ url: fullUrl, method, data, header: { 'Content-Type': 'application/json' }, success: (res) => { if (this.debug) { console.log(`[响应] ${fullUrl}`, { status: res.statusCode, data: res.data, headers: res.header }); } // 修复:更灵活的响应格式处理 if (!res.data) { const err = new Error('空响应数据'); if (attempt < retryCount) { this.retryRequest(makeRequest, attempt, retryCount, err); } else { reject(err); } return; } // 修复:支持多种成功状态码和响应格式 const isSuccess = res.statusCode >= 200 && res.statusCode < 300; const hasData = res.data && (res.data.data !== undefined || typeof res.data === 'object'); if (isSuccess && hasData) { resolve(res.data.data || res.data); } else { const errMsg = res.data.message || res.data.text || 'API错误'; const err = new Error(errMsg); if (attempt < retryCount) { this.retryRequest(makeRequest, attempt, retryCount, err); } else { reject(err); } } }, fail: (err) => { if (this.debug) { console.error(`[失败] ${fullUrl}`, err); } const error = new Error(`网络请求失败: ${err.errMsg || '未知错误'}`); if (attempt < retryCount) { this.retryRequest(makeRequest, attempt, retryCount, error); } else { reject(error); } } }); }; makeRequest(1); }); } retryRequest(makeRequest, attempt, retryCount, error) { const delay = 1000 * attempt; console.warn(`请求失败 (${attempt}/${retryCount}), ${delay}ms后重试:`, error.message); setTimeout(() => makeRequest(attempt + 1), delay); } /** * 注册回调函数 */ registerCallback(entity, callback) { if (!this.callbacks[entity]) { this.callbacks[entity] = []; } this.callbacks[entity].push(callback); } /** * 注销回调函数 */ unregisterCallback(entity, callback) { if (!this.callbacks[entity]) return; const index = this.callbacks[entity].indexOf(callback); if (index !== -1) { this.callbacks[entity].splice(index, 1); } } /** * 触发回调函数 */ triggerCallbacks(operation, entity, data) { this.callbacks.all.forEach(cb => cb(operation, entity, data)); if (this.callbacks[entity]) { this.callbacks[entity].forEach(cb => cb(operation, data)); } } /** * 检查重复实体 */ checkDuplicate(entity, data) { // 修复:确保引用已解析 const resolvedData = resolveDataReference(data, this.data); switch (entity) { case 'bancai': return this.data.bancais.some(b => b.houdu === resolvedData.houdu && b.caizhi?.id === resolvedData.caizhi?.id && b.mupi1?.id === resolvedData.mupi1?.id && b.mupi2?.id === resolvedData.mupi2?.id ); case 'caizhi': return this.data.caizhis.some(c => c.name === resolvedData.name); case 'mupi': return this.data.mupis.some(m => m.name === resolvedData.name && m.you === resolvedData.you ); case 'chanpin': return this.data.chanpins.some(c => c.bianhao === resolvedData.bianhao); case 'zujian': return this.data.zujians.some(z => z.name === resolvedData.name); case 'dingdan': return this.data.dingdans.some(d => d.number === resolvedData.number); case 'chanpin_zujian': return this.data.chanpin_zujians.some(cz => cz.chanpin?.id === resolvedData.chanpin?.id && cz.zujian?.id === resolvedData.zujian?.id ); case 'dingdan_chanpin': return this.data.dingdan_chanpins.some(dc => dc.dingdan?.id === resolvedData.dingdan?.id && dc.chanpin?.id === resolvedData.chanpin?.id ); case 'dingdan_bancai': return this.data.dingdan_bancais.some(db => db.dingdan?.id === resolvedData.dingdan?.id && 
db.chanpin?.id === resolvedData.chanpin?.id && db.zujian?.id === resolvedData.zujian?.id && db.bancai?.id === resolvedData.bancai?.id ); case 'user': return this.data.users.some(u => u.name === resolvedData.name); default: return false; } } /** * CRUD操作通用方法 */ async crudOperation(operation, entity, data) { try { // 使用微信请求API替代fetch const result = await this.request(`/app/${operation}/${entity}`, 'POST', data); this.updateLocalData(operation, entity, result || data); this.triggerCallbacks(operation, entity, result || data); return result; } catch (error) { console.error('CRUD error:', error); this.triggerCallbacks(`${operation}_error`, entity, { data, error: error.message }); throw error; } } /** * 更新本地数据 * @param {string} operation - 操作类型: 'add' | 'update' | 'delete' * @param {string} entity - 实体名称 * @param {Object} newData - 新数据对象(包含id字段) * @description 根据操作类型对本地数据进行增删改操作 */ updateLocalData(operation, entity, newData) { const key = `${entity}s`; const entities = this._rawData[key]; // 确保新数据的引用已解析 const resolvedData = resolveDataReference(newData, this._rawData); switch (operation) { case 'add': entities.push(resolvedData); break; case 'update': const index = entities.findIndex(item => item.id === resolvedData.id); if (index !== -1) { // 修复:使用对象展开操作符确保属性完整覆盖 entities[index] = { ...entities[index], ...resolvedData }; } else { entities.push(resolvedData); } break; case 'delete': const deleteIndex = entities.findIndex(item => item.id === resolvedData.id); if (deleteIndex !== -1) { entities.splice(deleteIndex, 1); } break; } // 更新最后修改时间 this._rawData._lastModified = new Date().toISOString(); // 清除懒加载缓存 this.lazyLoader.clearCache(); // 保存修改后的数据到本地存储 this.saveDataToStorage(); } /** * 同步数据方法 * 该方法用于异步获取所有数据,并处理同步过程中的并发请求 * 如果同步正在进行中,会将请求标记为待处理(pendingSync) * 同步完成后会自动处理待处理的请求 * @async * @throws {Error} 当获取数据失败时会抛出错误并记录日志 */ async syncData() { if (this.isSyncing) { this.pendingSync = true; return; } this.isSyncing = true; try { // 1. 先加载本地数据 this.loadDataFromStorage(); // 2. 获取最后同步时间,用于增量更新 const since = this._rawData._lastSync || null; // 3. 获取增量数据 await this.fetchAll(since); // 4. 清除懒加载缓存 this.lazyLoader.clearCache(); // 5. 保存更新后的数据到本地存储 this.saveDataToStorage(); // 6. 
触发数据更新回调 this.triggerCallbacks('refresh', 'all', this.data); } catch (error) { console.error('Sync failed:', error); this.triggerCallbacks('sync_error', 'all', { error }); // 失败时尝试使用本地数据 if (!this._rawData._lastSync) { throw new Error('初始化数据同步失败'); } } finally { this.isSyncing = false; if (this.pendingSync) { this.pendingSync = false; this.syncData(); } } } /** * 从本地存储加载数据 * 使用微信小程序的同步存储API获取之前保存的数据 */ loadDataFromStorage() { try { const storedData = wx.getStorageSync(this.storageKey); if (storedData) { // 修复:加载到_rawData而非data代理对象 this._rawData = storedData; // 解析所有引用关系 // resolveDataReferences(this._rawData); } } catch (error) { console.error('加载本地存储数据失败:', error); // 提供默认空数据 this._rawData = { bancais: [], dingdans: [], mupis: [], chanpins: [], kucuns: [], dingdan_bancais: [], chanpin_zujians: [], zujians: [], caizhis: [], dingdan_chanpins: [], users: [], jinhuos: [], _lastModified: null, _lastSync: null }; } } /** * 保存数据到本地存储 * 使用微信小程序的同步存储API持久化当前数据 */ saveDataToStorage() { try { // 修复:保存_rawData而非localData wx.setStorageSync(this.storageKey, this._rawData); } catch (error) { console.error('保存数据到本地存储失败:', error); // 提示用户或执行降级策略 wx.showToast({ title: '数据保存失败,请稍后重试', icon: 'none' }); } } /** * 添加实体数据 * @async * @param {string} entity - 实体类型 * @param {Object} data - 要添加的实体数据 * @returns {Promise} 返回CRUD操作结果 * @throws {Error} 如果数据已存在则抛出错误 */ async addEntity(entity, data) { if (this.checkDuplicate(entity, data)) { const errorMsg = `${this.entiyeText[entity]}`; this.triggerCallbacks('duplicate_error', entity, { data, error: errorMsg }); throw new Error(errorMsg); } return this.crudOperation('add', entity, data); } /** * 更新实体 */ async updateEntity(entity, data) { return this.crudOperation('update', entity, data); } /** * 删除实体 */ async deleteEntity(entity, id) { return this.crudOperation('delete', entity, { id }); } getBancaisForZujian(zujianId) { const dingdan_bancais = this.data.dingdan_bancais.filter(db => db.zujian?.id == zujianId); return dingdan_bancais.map(db => db.bancai).filter(Boolean); } /** * 获取板材的库存信息 */ getKucunForBancai(bancaiId) { return this.data.kucuns.find(k => k.bancai?.id == bancaiId); } /** * 执行事务操作 * @param {string} endpoint - 事务接口的具体路径(例如:'kucunbianji') * @param {Object} data - 发送给接口的数据 * @returns {Promise<boolean>} 操作是否成功 */ async transactionalOperation(endpoint, data) { try { // 发送请求到事务接口 await this.request(`/app/Transactional/${endpoint}`, 'POST', data); // 事务操作成功后,主动同步增量数据 await this.syncData(); // 返回成功 return true; } catch (error) { console.error(`事务操作失败 (${endpoint}):`, error); // 触发错误回调 this.triggerCallbacks('transaction_error', endpoint, { data, error }); throw error; } } /** * 库存编辑事务操作 * @param {Object} kucunData - 库存编辑数据 * @returns {Promise<boolean>} 操作是否成功 */ async kucunBianji(kucunData) { return this.transactionalOperation('kucunbianji', kucunData); } } // 导出模块 module.exports = MiniProgramDataManager;简化优化代码量
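One place where the code above can shed many lines, per the request to simplify and shrink it, is the request/retry logic, which repeats the retry branch in both the success and fail callbacks. Below is a minimal sketch of the same behaviour written with async/await; wxRequest is a hypothetical helper that only promisifies wx.request, and the { data, message } response envelope is assumed to match the code above, so treat it as a starting point rather than a drop-in replacement.

// Hypothetical helper: promisify wx.request so the retry loop can use await.
function wxRequest(options) {
  return new Promise((resolve, reject) => {
    wx.request({ ...options, success: resolve, fail: reject });
  });
}

// Same retry policy as the original (up to `retries` attempts, 1000ms * attempt back-off),
// but written once instead of duplicated in the success and fail branches.
async function requestWithRetry(baseUrl, url, method = 'GET', data = null, retries = 3) {
  let lastError;
  for (let attempt = 1; attempt <= retries; attempt++) {
    try {
      const res = await wxRequest({
        url: `${baseUrl}${url}`,
        method,
        data,
        header: { 'Content-Type': 'application/json' }
      });
      const ok = res.statusCode >= 200 && res.statusCode < 300 && res.data;
      if (ok) return res.data.data !== undefined ? res.data.data : res.data;
      lastError = new Error((res.data && (res.data.message || res.data.text)) || 'API错误');
    } catch (err) {
      lastError = new Error(`网络请求失败: ${err.errMsg || '未知错误'}`);
    }
    if (attempt < retries) await new Promise(r => setTimeout(r, 1000 * attempt));
  }
  throw lastError;
}

Inside the class, a wrapper like this could back both request() and crudOperation(), which would remove the separate retryRequest helper entirely.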
class LazyLoader {
  constructor(dataManager) {
    this.dataManager = dataManager;
    this.cache = new Map();
    this.entityTypeMap = {
      bancai: 'bancais', dingdan: 'dingdans', mupi: 'mupis', chanpin: 'chanpins',
      kucun: 'kucuns', dingdan_bancai: 'dingdan_bancais', chanpin_zujian: 'chanpin_zujians',
      zujian: 'zujians', caizhi: 'caizhis', dingdan_chanpin: 'dingdan_chanpins',
      user: 'users', jinhuo: 'jinhuos'
    };
    this.r = true; this.t = true; this.y = true;
  }

  createProxy(entity, entityType) {
    // 1. Check the cache first
    const cacheKey = `${entityType}_${entity.id}`;
    if (this.cache.has(cacheKey)) {
      return this.cache.get(cacheKey);
    }
    // 2. Simplified proxy handler
    const handler = {
      get: (target, prop, receiver) => {
        // Special-case properties
        if (prop === 'id') return target.id;
        const value = Reflect.get(target, prop, receiver);
        target[prop] = value;
        // Primitives are returned as-is
        if (typeof value !== 'object' || value === null) {
          return value;
        }
        // Array references
        if (Array.isArray(value)) {
          const refType = this.getReferenceType(prop);
          target[prop] = refType ? this.loadReferences(value, refType) : value;
        }
        // Object references
        const refType = this.getReferenceType(prop);
        if (refType) {
          // console.log(refType)
          target[prop] = this.loadReference(value, refType);
        }
        // Plain objects: create a nested proxy if needed
        if (!value.__isProxy) {
          target[prop] = this.createProxy(value, this.getEntityTypeFromRef(prop));
        }
        return target[prop];
      }
    };
    if (entity.__isProxy) { return entity; }
    const proxy = new Proxy(entity, handler);
    proxy.__isProxy = true;
    // 3. Cache immediately after creation
    this.cache.set(cacheKey, proxy);
    return proxy;
  }

  getEntityTypeFromRef(prop) {
    const baseProp = prop.replace(/\d/g, '');
    return baseProp in this.entityTypeMap ? this.entityTypeMap[baseProp] : `${baseProp}s`;
  }

  getReferenceType(prop) {
    const baseProp = prop.replace(/\d/g, '');
    if (this.entityTypeMap[baseProp]) return this.entityTypeMap[baseProp];
    const pluralProp = `${baseProp}s`;
    if (this.dataManager._rawData[pluralProp]) return pluralProp;
    return null;
  }

  loadReference(ref, refType) {
    if (!ref?.id) return ref; // assumed fix: the pasted code had no statement after this guard
    const cacheKey = `${refType}_${ref.id}`;
    // 4. Always go through the shared cache
    if (this.cache.has(cacheKey)) {
      return this.cache.get(cacheKey);
    }
    const entities = this.dataManager._rawData[refType] || [];
    const entity = entities.find(e => e.id === ref.id);
    if (!entity) {
      console.warn(`Entity not found: ${refType} with id ${ref.id}`);
      return ref;
    }
    // 5.
使用createProxy确保代理一致性 const prosty= this.createProxy(entity, refType); console.log(prosty) return prosty } loadReferences(refs, refType) { return refs.map(ref => this.loadReference(ref, refType)); } resolveReferences(entity) { for (const attr in entity) { const refType = this.getReferenceType(attr); if (!refType) continue; if (Array.isArray(entity[attr])) { entity[attr] = entity[attr].map(item => this.dataManager._rawData[refType]?.find(e => e.id === item.id) || item ); } else if (entity[attr]?.id) { entity[attr] = this.dataManager._rawData[refType]?.find(e => e.id === entity[attr].id) || entity[attr]; } } return entity; } clearCache() { this.cache.clear(); } } class MiniProgramDataManager { constructor(baseUrl = ‘’) { this.baseUrl = baseUrl; this.debug = true; this.networkAvailable = false; this.isSyncing = false; this.lastSync = null; this.syncInterval = 5 * 60 * 1000; this.storageKey = ‘miniProgramData’; this._rawData = this.createEmptyData(); this.lazyLoader = new LazyLoader(this); this.callbacks = { all: [], bancais: [], dingdans: [], mupis: [], chanpins: [], kucuns: [], chanpin_zujians: [], dingdan_bancais: [], zujians: [], caizhis: [], dingdan_chanpins: [], users: [], jinhuos: [] }; this.initNetwork(); this.loadDataFromStorage(); this.startAutoSync(); } createEmptyData() { return { bancais: [], dingdans: [], mupis: [], chanpins: [], kucuns: [], dingdan_bancais: [], chanpin_zujians: [], zujians: [], caizhis: [], dingdan_chanpins: [], users: [], jinhuos: [], _lastModified: null, _lastSync: null }; } get data() { const handler = { get: (target, prop) => { if (prop.startsWith(‘_’)) return target[prop]; if (Array.isArray(target[prop])) { return target[prop].map(item => this.lazyLoader.createProxy(item, prop.replace(/s$/, ‘’)) ); } return target[prop]; }, set: (target, prop, value) => { target[prop] = value; return true; } }; return new Proxy(this._rawData, handler); } async initialize() { try { await this.syncData(); return true; } catch (error) { if (this._rawData._lastSync) return true; throw error; } } startAutoSync() { this.autoSyncTimer = setInterval(() => { !this.isSyncing && this.syncData(); }, this.syncInterval); } stopAutoSync() { clearInterval(this.autoSyncTimer); } async initNetwork() { try { const { networkType } = await wx.getNetworkType(); this.networkAvailable = networkType !== ‘none’; } catch { this.networkAvailable = false; } } async syncData() { if (this.isSyncing) return; this.isSyncing = true; try { const since = this._rawData._lastSync; await this.fetchAll(since); this.lazyLoader.clearCache(); this.saveDataToStorage(); this.triggerCallbacks('refresh', 'all', this.data); } catch (error) { console.error('Sync failed:', error); this.triggerCallbacks('sync_error', 'all', { error }); if (!this._rawData._lastSync) throw error; } finally { this.isSyncing = false; } } async fetchAll(since) { try { const params = since ? { since } : {}; const resolvedData = this.baseUrl ? await this.request(‘/app/all’, ‘GET’, params) : this.createEmptyData(); Object.keys(this._rawData).forEach(key => { if (key.startsWith('_') || !resolvedData[key]) return; if (since) { resolvedData[key].forEach(newItem => { const index = this._rawData[key].findIndex(item => item.id === newItem.id); index >= 0 ? 
this._rawData[key][index] = newItem : this._rawData[key].push(newItem); }); } else { this._rawData[key] = resolvedData[key]; } }); this._rawData._lastSync = new Date().toISOString(); this.saveDataToStorage(); return true; } catch (error) { console.error('Fetch error:', error); this.triggerCallbacks('fetch_error', 'all', { error }); throw error; } } async request(url, method, data, retryCount = 3) { return new Promise((resolve, reject) => { const fullUrl = ${this.baseUrl}${url}; const requestTask = () => { wx.request({ url: fullUrl, method, data, header: { 'Content-Type': 'application/json' }, success: (res) => { if (res.statusCode >= 200 && res.statusCode < 300) { resolve(res.data.data); } else { const err = new Error(res.data?.message || 'API error'); retryCount > 1 ? setTimeout(requestTask, 1000, retryCount - 1) : reject(err); } }, fail: (err) => { retryCount > 1 ? setTimeout(requestTask, 1000, retryCount - 1) : reject(new Error(`Network error: ${err.errMsg}`)); } }); }; requestTask(); }); } registerCallback(entity, callback) { this.callbacks[entity]?.push(callback) || this.callbacks.all.push(callback); } unregisterCallback(entity, callback) { const arr = this.callbacks[entity] || this.callbacks.all; const index = arr.indexOf(callback); if (index !== -1) arr.splice(index, 1); } triggerCallbacks(operation, entity, data) { this.callbacks.all.forEach(cb => cb(operation, entity, data)); this.callbacks[entity]?.forEach(cb => cb(operation, data)); } async crudOperation(operation, entity, data) { try { const result = await this.request(/app/${operation}/${entity}, ‘POST’, data); this.updateLocalData(operation, entity, result || data); this.triggerCallbacks(operation, entity, result || data); return result; } catch (error) { this.triggerCallbacks(${operation}_error, entity, { data, error }); throw error; } } updateLocalData(operation, entity, data) { const key = ${entity}s; const collection = this._rawData[key] || []; switch (operation) { case 'add': collection.push(data); break; case 'update': const index = collection.findIndex(item => item.id === data.id); index >= 0 ? 
collection[index] = data : collection.push(data); break;
      case 'delete':
        const deleteIndex = collection.findIndex(item => item.id === data.id);
        if (deleteIndex >= 0) collection.splice(deleteIndex, 1);
        break;
    }
    this._rawData._lastModified = new Date().toISOString();
    this.lazyLoader.clearCache();
    this.saveDataToStorage();
  }

  loadDataFromStorage() {
    try {
      const storedData = wx.getStorageSync(this.storageKey);
      if (storedData) this._rawData = storedData;
    } catch (error) {
      console.error('Storage load error:', error);
    }
  }

  saveDataToStorage() {
    try {
      wx.setStorageSync(this.storageKey, this._rawData);
    } catch (error) {
      console.error('Storage save error:', error);
      wx.showToast({ title: '数据保存失败', icon: 'none' });
    }
  }

  async addEntity(entity, data) { return this.crudOperation('add', entity, data); }
  async updateEntity(entity, data) { return this.crudOperation('update', entity, data); }
  async deleteEntity(entity, id) { return this.crudOperation('delete', entity, { id }); }

  async transactionalOperation(endpoint, data) {
    try {
      await this.request(`/app/Transactional/${endpoint}`, 'POST', data);
      await this.syncData();
      return true;
    } catch (error) {
      this.triggerCallbacks('transaction_error', endpoint, { data, error });
      throw error;
    }
  }
}

module.exports = MiniProgramDataManager;
----------------------------------------------------------------------------------
The snippet in loadReference that produces the output below:

const prosty = this.createProxy(entity, refType);
console.log(prosty); /* some of what this prints is a Proxy, some is not */
return prosty;
------------------------------------------------
Console output (excerpt; dozens of similar bancai, dingdan and chanpin entries omitted):

MiniProgramDataManager.js? [sm]:117 {number: "2025-31", xiadan: "2025-06-28T00:00:00.000+00:00", jiaohuo: null, dingdan_chanpin: Array(0), id: 1, …}
MiniProgramDataManager.js? [sm]:117 {caizhi: {id: 1}, mupi1: {id: 1}, mupi2: {id: 1}, houdu: 15, kucun: null, lastUpdated: "2025-06-29T19:58:31.000+00:00", __isProxy: true, …}
MiniProgramDataManager.js? [sm]:117 Proxy {name: "侧板", chanping_zujian: Array(3), id: 1, …}
MiniProgramDataManager.js? [sm]:117 Proxy {you: true, name: "桃花心", bancaisForMupi1: Array(10), bancaisForMupi2: Array(4), id: 1, …}
index.js? [sm]:124 undefined

Some of the logged values are Proxy objects, others are plain entities whose caizhi / mupi1 / mupi2 fields are still bare {id: …} references, and index.js keeps logging undefined for them. The caizhi (material) names all come back empty.
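One plausible explanation for both symptoms, reading only the code pasted above: the proxy handler defines no set trap, so proxy.__isProxy = true (and the target[prop] = … writes inside get) land on the raw entity stored in _rawData; the excerpt above even shows __isProxy: true on a plain object. After clearCache() runs (it is called on every sync and local update), the next createProxy call finds entity.__isProxy already true and returns the raw record instead of a Proxy, so its caizhi / mupi fields stay as bare { id: … } stubs and the material names read back empty. The sketch below tracks proxies in a WeakSet / WeakMap instead of flagging the target; LazyLoaderFixed and its members are illustrative names, and getReferenceType is reduced to the plural-collection lookup.

class LazyLoaderFixed {
  constructor(dataManager) {
    this.dataManager = dataManager;
    this.cache = new Map();       // `${type}_${id}` -> proxy
    this.proxies = new WeakSet(); // proxies created here
    this.proxied = new WeakMap(); // raw entity -> its proxy
  }

  createProxy(entity, entityType) {
    if (this.proxies.has(entity)) return entity;             // already a proxy
    if (this.proxied.has(entity)) return this.proxied.get(entity);
    const cacheKey = `${entityType}_${entity.id}`;
    if (this.cache.has(cacheKey)) return this.cache.get(cacheKey);

    const handler = {
      get: (target, prop, receiver) => {
        const value = Reflect.get(target, prop, receiver);
        if (typeof value !== 'object' || value === null) return value;
        const refType = this.getReferenceType(prop);
        if (Array.isArray(value)) {
          return refType ? value.map(v => this.loadReference(v, refType)) : value;
        }
        return refType ? this.loadReference(value, refType) : value;
      }
    };

    const proxy = new Proxy(entity, handler);
    this.proxies.add(proxy);
    this.proxied.set(entity, proxy);
    this.cache.set(cacheKey, proxy);
    return proxy;
  }

  // Resolve a bare { id } reference against _rawData and wrap the hit in a proxy.
  loadReference(ref, refType) {
    if (!ref || ref.id === undefined) return ref;
    const entity = (this.dataManager._rawData[refType] || []).find(e => e.id === ref.id);
    return entity ? this.createProxy(entity, refType) : ref;
  }

  // Minimal lookup: strip digits (mupi1 -> mupi) and try the plural collection name.
  getReferenceType(prop) {
    if (typeof prop !== 'string') return null;
    const plural = `${prop.replace(/\d/g, '')}s`;
    return this.dataManager._rawData[plural] ? plural : null;
  }

  clearCache() { this.cache.clear(); }
}

With this layout, console.log(this.createProxy(entity, refType)) always prints a Proxy, and reading bancai.caizhi.name goes through loadReference and resolves against _rawData.caizhis instead of returning undefined from a bare { id: 1 } stub.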