14:17:55.065 [Thread-29] INFO c.r.k.t.KettleUtil - [runKettleJob,288] - 运行作业[etl_plasma_out_job]携带参数={"SCHEDULEINO":"122"}
14:17:55.070 [Thread-29] INFO c.r.k.l.XLogListener - [addLogListener,248] - 任务etl_plasma_out_job日志监听启动了,日志路径D:\kettle\logs...
2025/07/28 14:17:55 - etl_plasma_out_job - 开始执行任务
14:17:55.088 [Thread-46] WARN o.pentaho.di.job.Job - [logToLogger,92] - [D:\kettle\workspace\调试\base_etl_online\etl_plasma_out.kjb] 开始执行任务
14:17:55.101 [Thread-46] DEBUG c.r.k.l.XLogListener - [eventAdded,238] - 因为异常丢失的日志{"timeStamp":1753683475085,"level":"MINIMAL","message":{"logChannelId":"48709c82-99b7-4e0b-a797-50dba245c27b","level":"MINIMAL","subject":"etl_plasma_out_job","error":false,"message":"开始执行任务"}}
2025/07/28 14:17:55 - etl_plasma_out_job - 开始项[浆站编号入参]
14:17:55.458 [Thread-46] INFO o.pentaho.di.job.Job - [logToLogger,96] - [D:\kettle\workspace\调试\base_etl_online\etl_plasma_out.kjb] 开始项[浆站编号入参]
14:17:55.458 [Thread-46] DEBUG c.r.k.l.XLogListener - [eventAdded,238] - 因为异常丢失的日志{"timeStamp":1753683475458,"level":"BASIC","message":{"logChannelId":"48709c82-99b7-4e0b-a797-50dba245c27b","level":"BASIC","subject":"etl_plasma_out_job","error":false,"message":"开始项[浆站编号入参]"}}
2025/07/28 14:17:55 - 自定义日志输出 - 传入参数:'122'
14:17:55.590 [Thread-46] DEBUG c.r.k.l.XLogListener - [eventAdded,238] - 因为异常丢失的日志{"timeStamp":1753683475590,"level":"MINIMAL","message":{"logChannelId":"b647af09-3ce1-4ea1-822d-1d676c725f2f","level":"MINIMAL","subject":"自定义日志输出","error":false,"message":"传入参数:'122'"}}
2025/07/28 14:17:55 - etl_plasma_out_job - 开始项[调度任务]
14:17:55.591 [Thread-46] INFO o.pentaho.di.job.Job - [logToLogger,96] - [D:\kettle\workspace\调试\base_etl_online\etl_plasma_out.kjb] 开始项[调度任务]
14:17:55.591 [Thread-46] DEBUG c.r.k.l.XLogListener - [eventAdded,238] - 因为异常丢失的日志{"timeStamp":1753683475591,"level":"BASIC","message":{"logChannelId":"48709c82-99b7-4e0b-a797-50dba245c27b","level":"BASIC","subject":"etl_plasma_out_job","error":false,"message":"开始项[调度任务]"}}
2025/07/28 14:17:55 - 调度任务 - Using run configuration [Pentaho local]
14:17:55.955 [Thread-46] INFO o.pentaho.di.job.Job - [logToLogger,96] - [D:\kettle\workspace\调试\base_etl_online\etl_plasma_out.kjb] Using run configuration [Pentaho local]
14:17:55.956 [Thread-46] DEBUG c.r.k.l.XLogListener - [eventAdded,238] - 因为异常丢失的日志{"timeStamp":1753683475955,"level":"BASIC","message":{"logChannelId":"84bfa1a0-89fe-4674-b2bd-516d3d15df3c","level":"BASIC","subject":"调度任务","arguments":["Pentaho local"],"error":false,"message":"Using run configuration [Pentaho local]"}}
2025/07/28 14:17:55 - 调度任务 - Using legacy execution engine
14:17:55.959 [Thread-46] INFO o.pentaho.di.job.Job - [logToLogger,96] - [D:\kettle\workspace\调试\base_etl_online\etl_plasma_out.kjb] Using legacy execution engine
14:17:55.959 [Thread-46] DEBUG c.r.k.l.XLogListener - [eventAdded,238] - 因为异常丢失的日志{"timeStamp":1753683475959,"level":"BASIC","message":{"logChannelId":"84bfa1a0-89fe-4674-b2bd-516d3d15df3c","level":"BASIC","subject":"调度任务","error":false,"message":"Using legacy execution engine"}}
2025/07/28 14:17:55 - 调度任务初始化 - 为了转换解除补丁开始 [调度任务初始化]
14:17:55.962 [Thread-46] INFO o.p.di.trans.Trans - [logToLogger,96] - [D:\kettle\workspace\调试\base_etl_online\etl_plasma_out.kjb file:///D:/kettle/workspace/调试/base_etl_online/t_etl_scheduler_online.ktr] 为了转换解除补丁开始 [调度任务初始化]
14:17:55.962 [Thread-46] DEBUG c.r.k.l.XLogListener - [eventAdded,238] - 因为异常丢失的日志{"timeStamp":1753683475962,"level":"BASIC","message":{"logChannelId":"0635217f-7127-414b-a91d-c76634f60bdb","level":"BASIC","subject":"调度任务初始化","error":false,"message":"为了转换解除补丁开始 [调度任务初始化]"}}
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
14:17:56.434 [Thread-36] INFO c.r.k.t.KettleUtil - [runKettleJob,288] - 运行作业[etl_plasma_out_job]携带参数={"SCHEDULEINO":"123"}
14:17:56.434 [Thread-36] INFO c.r.k.l.XLogListener - [addLogListener,248] - 任务etl_plasma_out_job日志监听启动了,日志路径D:\kettle\logs...
2025/07/28 14:17:56 - etl_plasma_out_job - 开始执行任务
14:17:56.439 [Thread-50] WARN o.pentaho.di.job.Job - [logToLogger,92] - [D:\kettle\workspace\调试\base_etl_online\etl_plasma_out.kjb] 开始执行任务
14:17:56.439 [Thread-50] DEBUG c.r.k.l.XLogListener - [eventAdded,238] - 因为异常丢失的日志{"timeStamp":1753683476438,"level":"MINIMAL","message":{"logChannelId":"48709c82-99b7-4e0b-a797-50dba245c27b","level":"MINIMAL","subject":"etl_plasma_out_job","error":false,"message":"开始执行任务"}}
14:17:56.440 [Thread-50] DEBUG c.r.k.l.XLogListener - [eventAdded,238] - 因为异常丢失的日志{"timeStamp":1753683476438,"level":"MINIMAL","message":{"logChannelId":"48709c82-99b7-4e0b-a797-50dba245c27b","level":"MINIMAL","subject":"etl_plasma_out_job","error":false,"message":"开始执行任务"}}
14:17:56.482 [http-nio-8888-exec-77] DEBUG c.r.k.m.K.selectKettleJobList_COUNT - [debug,135] - ==> Preparing: SELECT count(0) FROM kettle_job WHERE (role_key LIKE concat('%', ?, '%')) AND (is_del != 1 OR is_del IS NULL)
14:17:56.484 [http-nio-8888-exec-77] DEBUG c.r.k.m.K.selectKettleJobList_COUNT - [debug,135] - ==> Parameters: admin(String)
14:17:56.491 [http-nio-8888-exec-77] DEBUG c.r.k.m.K.selectKettleJobList_COUNT - [debug,135] - <== Total: 1
14:17:56.493 [http-nio-8888-exec-77] DEBUG c.r.k.m.K.selectKettleJobList - [debug,135] - ==> Preparing: select id, created_time, update_time, created_by, update_by, job_name, job_description, job_type, job_path, job_repository_id, job_log_level, job_status, is_del, is_monitor_enabled, role_key, tpl_key,last_succeed_time from kettle_job WHERE ( role_key like concat('%',?, '%') ) and (is_del != 1 or is_del is null) LIMIT ?
14:17:56.494 [http-nio-8888-exec-77] DEBUG c.r.k.m.K.selectKettleJobList - [debug,135] - ==> Parameters: admin(String), 10(Integer)
14:17:56.513 [http-nio-8888-exec-77] DEBUG c.r.k.m.K.selectKettleJobList - [debug,135] - <== Total: 5
2025/07/28 14:17:56 - etl_plasma_out_job - 开始项[浆站编号入参]
14:17:56.672 [Thread-50] INFO o.pentaho.di.job.Job - [logToLogger,96] - [D:\kettle\workspace\调试\base_etl_online\etl_plasma_out.kjb] 开始项[浆站编号入参]
14:17:56.672 [Thread-50] DEBUG c.r.k.l.XLogListener - [eventAdded,238] - 因为异常丢失的日志{"timeStamp":1753683476671,"level":"BASIC","message":{"logChannelId":"48709c82-99b7-4e0b-a797-50dba245c27b","level":"BASIC","subject":"etl_plasma_out_job","error":false,"message":"开始项[浆站编号入参]"}}
14:17:56.673 [Thread-50] DEBUG c.r.k.l.XLogListener - [eventAdded,238] - 因为异常丢失的日志{"timeStamp":1753683476671,"level":"BASIC","message":{"logChannelId":"48709c82-99b7-4e0b-a797-50dba245c27b","level":"BASIC","subject":"etl_plasma_out_job","error":false,"message":"开始项[浆站编号入参]"}}
2025/07/28 14:17:56 - 自定义日志输出 - 传入参数:'123'
package com.ruoyi.kettle.listener;
import com.alibaba.fastjson.JSON;
import com.ruoyi.kettle.domain.KettleJob;
import com.ruoyi.kettle.service.LogService; // assumed package; LogService is used below but was not imported
import com.ruoyi.kettle.service.SpringContextUtil;
import com.ruoyi.kettle.tools.DateHelper;
import com.ruoyi.kettle.tools.GenCodeUtil;
import com.ruoyi.kettle.tools.KettleUtil;
import org.pentaho.di.core.logging.*;
import lombok.extern.slf4j.Slf4j;
import org.apache.commons.lang.StringUtils;
import org.pentaho.di.core.Const;
import org.pentaho.di.core.exception.KettleException;
import org.pentaho.di.job.Job;
import org.pentaho.di.trans.Trans;
import java.io.*;
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.List;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.regex.Matcher;
import java.util.regex.Pattern;
/**
* @Author: yuenbin
* @Date: 2020/11/26
* @Time: 15:32
* @Motto: It is better to be clear than to be clever!
* @Description: adds a complete listener for Kettle execution logs
**/
@Slf4j
public class XLogListener implements KettleLoggingEventListener {
public static Map<Object, File> jobFileMap = new ConcurrentHashMap<>();
public KettleLogLayout layout = new KettleLogLayout(true);
public Pattern pattern;
public OutputStream outputStream;
/**
* Tracks the Job/Trans objects currently being listened to, keyed by thread/job id
*/
private static Map<String, Object> activeThreadMap = new ConcurrentHashMap<>();
/**
* Records start times, keyed by thread/job id
*/
private static Map<String, String> startTimeMap = new ConcurrentHashMap<>();
private String logFilePath = "";
private KettleJob kettleJob;
private String logFileSize = "20";
private String logChannelId;
private String threadId;
private String logType;
/** Global log record ID */
private String logId;
private StringBuilder logStr = new StringBuilder();
private Object obj = null;
public XLogListener() {
}
public XLogListener(String logFilePath, Object obj, KettleJob kettleJob) throws KettleException {
this.logFilePath = logFilePath;
this.obj = obj;
this.kettleJob = kettleJob;
setOutputStream(addLogListener(logFilePath, obj, kettleJob));
this.setLogId(GenCodeUtil.nextId());
if (obj instanceof Job) {
Job job = (Job) obj;
this.setLogType("job");
// NOTE: the modified line mentioned at the end of the post; the "_<id>" suffix means
// the stored id no longer matches the id the Job registered in LoggingRegistry
this.setLogChannelId(job.getLogChannelId() +'_'+ kettleJob.getId());
this.setThreadId(kettleJob.getId().toString());
} else {
Trans trans = (Trans) obj;
this.setLogType("trans");
this.setLogChannelId(trans.getLogChannelId());
this.setThreadId(trans.getObjectId().getId());
}
}
public boolean writeFileLog(KettleLoggingEvent event,KettleJob kettleJob) {
try {
Object messageObject = event.getMessage();
if (messageObject instanceof LogMessage) {
boolean logToFile = false;
if (this.getLogChannelId() == null) {
logToFile = true;
} else {
LogMessage message = (LogMessage) messageObject;
List<String> logChannelChildren = LoggingRegistry.getInstance().getLogChannelChildren(this.getLogChannelId());
logToFile = Const.indexOfString(message.getLogChannelId(), logChannelChildren) >= 0;
}
if (logToFile) {
String logText = this.layout.format(event);
this.getOutputStream().write(logText.getBytes());
this.getOutputStream().write(Const.CR.getBytes());
return true;
}
}
} catch (IOException e) {
log.error("写入日志出现异常,原因为:", e);
}
return false;
}
public boolean writeDbLog(KettleLoggingEvent event,KettleJob kettleJob) {
LogService logService = (LogService) SpringContextUtil.getBean("logService", LogService.class);
try {
Object messageObject = event.getMessage();
if (messageObject instanceof LogMessage) {
boolean logToDb = false;
if (this.getLogChannelId() == null) {
logToDb = true;
} else {
LogMessage message = (LogMessage) messageObject;
List<String> logChannelChildren = LoggingRegistry.getInstance().getLogChannelChildren(this.getLogChannelId());
logToDb = Const.indexOfString(message.getLogChannelId(), logChannelChildren) >= 0;
}
if (logToDb) {
String logText = this.layout.format(event);
String type = "";
// default value: currently running
String recordStatus = "";
String startTime = "";
String endTime = "";
String logFile = "";
if (obj instanceof Job) {
Job job = (Job) obj;
type = "job";
startTime = startTimeMap.get(this.getThreadId());
logFile = this.jobFileMap.get(job).getAbsolutePath();
recordStatus = KettleUtil.getJobStatus(job).value();
} else if (obj instanceof Trans) {
Trans trans = (Trans) obj;
type = "trans";
startTime = startTimeMap.get(this.getThreadId());
logFile = this.jobFileMap.get(trans).getAbsolutePath();
recordStatus = KettleUtil.getTransStatus(trans).value();
}
// save the log execution record to the database
String logId = this.getLogId();
/* logService.addLog(logId, this.getThreadId(), type, recordStatus,
logFile,
this.logStr.append(logText).append((char) 13).append((char) 10).toString(),
startTime, endTime);*/
return true;
}
}
} catch (Exception e) {
log.error("写入日志出现异常,原因为:", e);
}
return false;
}
public void recordWarningLog(KettleLoggingEvent event,KettleJob kettleJob) {
LogService logService = (LogService) SpringContextUtil.getBean("logService", LogService.class);
Object object = event.getMessage();
LogMessage message = (LogMessage) object;
String joblogStr = message.getMessage();
pattern = Pattern.compile("(error)");
Matcher m = pattern.matcher(joblogStr);
if (m.find() || message.getLevel().isError()) {
String msg = getExceptionMsg(joblogStr, m);
String logLevel = message.getLevel().getLevel() + "";
String error = String.valueOf(message.isError());
String subject = message.getSubject();
String logChannel = message.getLogChannelId();
String logFile = this.logFilePath;
String targetId = threadId;
String type = "";
String targetName = "未知线程任务:" + Thread.currentThread().getName();
if (obj instanceof Job) {
Job job = (Job) obj;
logFile = jobFileMap.get(job).getAbsolutePath();
targetId = kettleJob.getId().toString();
targetName = kettleJob.getJobName();
type = "job";
} else if (obj instanceof Trans) {
Trans trans = (Trans) obj;
logFile = jobFileMap.get(trans).getAbsolutePath();
targetId = trans.getObjectId().getId();
targetName = trans.getTransMeta().getName();
type = "trans";
}
logService.doAddLogWarning(logChannel, targetId, targetName, logFile, error, msg, subject, logLevel, type);
log.info("异常日志已保存入库!");
}
}
@Override
public void eventAdded(KettleLoggingEvent event) {
eventAdded( event, kettleJob);
}
public void eventAdded(KettleLoggingEvent event,KettleJob kettleJob) {
boolean failed = true;
try {
// synchronizing on a fresh Object() is a no-op; lock on the listener instance instead
synchronized (this) {
if (writeFileLog(event,kettleJob)) {
failed = false;
try {
getOutputStream().flush();
recordWarningLog(event,kettleJob);
// also write to the database
Thread.sleep(50);
writeDbLog(event,kettleJob);
} catch (Exception ex) {
failed = true;
log.error("日志文件写入出现异常,原因:", ex);
}
}
}
if (failed) {
log.debug("因为异常丢失的日志{}", JSON.toJSON(event));
recordWarningLog(event,kettleJob);
}
} catch (Exception ex) {
log.error("作业日志处理失败{},原因为:", JSON.toJSONString(event), ex);
}
}
public OutputStream addLogListener(String logPath, Object obj,KettleJob kettleJob) throws KettleException {
log.info("任务{}日志监听启动了,日志路径{}...", obj, logPath);
logFilePath = logPath;
String target;
String targetName;
if (obj instanceof Job) {
Job job = (Job) obj;
target = "job";
targetName = kettleJob.getId().toString()+"_"+kettleJob.getJobName();
activeThreadMap.put(kettleJob.getId().toString(), job);
startTimeMap.put(kettleJob.getId().toString(), DateHelper.format(new Date()));
} else {
Trans trans = (Trans) obj;
target = "trans";
targetName = trans.getTransMeta().getName();
activeThreadMap.put(trans.getObjectId().getId(), trans);
startTimeMap.put(trans.getObjectId().getId(), DateHelper.format(new Date()));
}
try {
File file = getLogFile(target, targetName);
if (file == null) {
throw new KettleException("必须指定日志文件物理路径!");
}
jobFileMap.put(obj, file);
return new FileOutputStream(file, true);
} catch (KettleException | FileNotFoundException e) {
throw new KettleException("出现异常,原因:" + e);
}
}
private File getLogFile(String target, String targetName) {
File file = null;
// synchronizing on a fresh Object() is a no-op; lock on the class so concurrent
// listeners do not race while creating the log file
synchronized (XLogListener.class) {
/** If a physical log path is configured, also write a copy of the log to disk */
if (StringUtils.isNotBlank(logFilePath)) {
logFilePath = logFilePath.replaceAll("\\\\", "\\/");
file = new File(logFilePath + "/" + target + "/" + targetName + "/");
if (!file.exists()) {
file.mkdirs();
}
StringBuilder logFilePathString = new StringBuilder();
logFilePathString
.append(file.getAbsolutePath()).append("/")
.append(new SimpleDateFormat("yyyyMMddHHmmssSSS").format(new Date()))
.append(".")
.append("txt");
file = new File(logFilePathString.toString());
if (!file.exists()) {
try {
file.createNewFile();
} catch (IOException e) {
log.error("创建文件出现异常,原因为:", e);
}
}
}
}
return file;
}
public String getLogChannelId() {
return logChannelId;
}
public void setLogChannelId(String logChannelId) {
this.logChannelId = logChannelId;
}
public String getThreadId() {
return threadId;
}
public void setThreadId(String threadId) {
this.threadId = threadId;
}
public String getLogType() {
return logType;
}
public void setLogType(String logType) {
this.logType = logType;
}
public String getLogId() {
return logId;
}
public void setLogId(String logId) {
this.logId = logId;
}
public void close() throws KettleException {
if(outputStream != null){
try {
outputStream.close();
} catch (IOException e) {
throw new KettleException(e);
}
}
}
public OutputStream getOutputStream() {
return outputStream;
}
public void setOutputStream(OutputStream outputStream) {
this.outputStream = outputStream;
}
public String getExceptionMsg(String joblogStr, Matcher m) {
if (joblogStr.length() <= 3000) {
return joblogStr;
}
// the caller may already have consumed the matcher's first match; without reset()
// the find()/start() calls below can misbehave or throw IllegalStateException
m.reset();
if (!m.find() || m.start() <= 100) {
return joblogStr.substring(0, 3000);
}
return joblogStr.length() - m.start() + 100 <= 3000
? joblogStr.substring(m.start() - 100)
: joblogStr.substring(m.start() - 100, m.start() + 2900);
}
}
I changed this line: this.setLogChannelId(job.getLogChannelId() +'_'+ kettleJob.getId());
After running it, the job produces the errors shown in the log above.
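The "因为异常丢失的日志" ("log lost due to exception") DEBUG lines are consistent with the change itself: writeFileLog and writeDbLog filter events by asking LoggingRegistry for the children of this.getLogChannelId(). The registry only knows the id the Job originally registered, so an id with a "_<jobId>" suffix matches nothing, the filter returns false for every event, and eventAdded falls through to the "lost log" branch. A minimal, self-contained sketch (plain Java standing in for the real org.pentaho.di.core.logging.LoggingRegistry; names and ids here are illustrative, not Pentaho's API) showing the lookup miss:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Stand-in for LoggingRegistry, which keys its channel hierarchy on the exact
// id each Job/Trans registered at startup.
public class ChannelLookupSketch {
    // parent channel id -> child channel ids (a channel counts as its own child)
    public static Map<String, List<String>> registry = new HashMap<>();

    public static List<String> getLogChannelChildren(String parentId) {
        // unknown ids (e.g. a suffixed one) simply yield an empty list
        return registry.getOrDefault(parentId, new ArrayList<>());
    }

    // mirrors writeFileLog's filter:
    // Const.indexOfString(message.getLogChannelId(), children) >= 0
    public static boolean shouldWrite(String listenerChannelId, String eventChannelId) {
        return getLogChannelChildren(listenerChannelId).contains(eventChannelId);
    }

    public static void main(String[] args) {
        String jobChannelId = "48709c82";                  // id the Job registered
        registry.put(jobChannelId, List.of(jobChannelId)); // registered under the ORIGINAL id

        // original code: the listener stores the registered id -> events pass the filter
        System.out.println(shouldWrite(jobChannelId, jobChannelId));          // true

        // modified code: the listener stores jobChannelId + '_' + kettleJob.getId()
        // -> lookup misses, writeFileLog returns false, the event is reported as "lost"
        System.out.println(shouldWrite(jobChannelId + "_122", jobChannelId)); // false
    }
}
```

Under this reading, the fix is to keep the registered id in logChannelId and carry kettleJob.getId() in a separate field if a composite key is needed elsewhere.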