Notes on fixing a File.createNewFile() error

This post describes an exception hit during Android development: creating a file failed because the file name contained the illegal character ":". Changing the date format used to build the file name avoided the error.


Reposted from: http://blog.youkuaiyun.com/moyuxueyi/article/details/39153725


The exception:

04-16 17:58:52.714: W/System.err(23703): Caused by: libcore.io.ErrnoException: open failed: EINVAL (Invalid argument)
04-16 17:58:52.714: W/System.err(23703): at libcore.io.Posix.open(Native Method)
04-16 17:58:52.714: W/System.err(23703): at libcore.io.BlockGuardOs.open(BlockGuardOs.java:110)
04-16 17:58:52.714: W/System.err(23703): at libcore.io.IoBridge.open(IoBridge.java:444)
04-16 17:58:52.714: W/System.err(23703): ... 6 more


Solution:

On Android, a file name must not contain the colon character ":". External storage is typically formatted as FAT/exFAT, which does not accept ":" in file names, so the underlying open() call rejects the path with EINVAL, as the stack trace above shows.
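If the file name comes from user input or some other format you do not control, a small sanitizer can guard against this class of error. The helper below is a minimal sketch of my own (it is not from the original post); it replaces characters that FAT/exFAT volumes commonly reject:

// Minimal sketch: replace characters that are typically invalid on
// Android external storage (FAT/exFAT) with a hyphen.
public static String toSafeFileName(String name) {
    return name.replaceAll("[\\\\/:*?\"<>|]", "-");
}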


Sure enough, when I checked my code, the file name was being built like this:

SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd_HH:mm:ss");
fileName = "/video-" + format.format(new Date()) + ".mp4";
Then I changed the colons to a different separator:

SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd_HH-mm-ss");
fileName = "/video-" + format.format(new Date()) + ".mp4";
With that change, the file is created successfully.
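For completeness, here is a minimal sketch of the corrected flow around the createNewFile() call that originally failed. The target directory and log tag are illustrative choices of mine, and the snippet assumes it runs inside a Context (for example, an Activity):

// Needs: java.io.File, java.io.IOException, java.text.SimpleDateFormat,
// java.util.Date, java.util.Locale, android.os.Environment, android.util.Log
SimpleDateFormat format = new SimpleDateFormat("yyyy-MM-dd_HH-mm-ss", Locale.US);
String fileName = "video-" + format.format(new Date()) + ".mp4";
File dir = getExternalFilesDir(Environment.DIRECTORY_MOVIES); // illustrative location
File file = new File(dir, fileName);
try {
    if (dir != null && !dir.exists()) {
        dir.mkdirs();
    }
    boolean created = file.createNewFile(); // succeeds now that the name contains no ':'
    Log.d("FileDemo", "created=" + created + " at " + file.getAbsolutePath());
} catch (IOException e) {
    Log.e("FileDemo", "createNewFile failed", e);
}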
