09 - Logging Factory

This post covers the MyBatis logging factory and why logging matters when troubleshooting database operations. It explains STDOUT_LOGGING, the standard console logger, which is enabled in the MyBatis core configuration file. It also introduces Log4j, an Apache open-source project that gives flexible control over logging, including output destinations and formats, and walks through its basic usage steps.

Logging Factory

When a database operation throws an exception and we need to troubleshoot it, the log is our best helper!

In the past: sout (System.out.println) and the debugger

Now: a logging factory!

  • LOG4J
  • STDOUT_LOGGING

STDOUT_LOGGING Explained

Which concrete logging implementation MyBatis uses is specified in its settings! (Valid values for logImpl include SLF4J, LOG4J, LOG4J2, JDK_LOGGING, COMMONS_LOGGING, STDOUT_LOGGING, and NO_LOGGING.)

STDOUT_LOGGING: standard console output

Configure the logger in the MyBatis core configuration file:

<settings>
    <!-- Standard logging factory implementation -->
    <setting name="logImpl" value="STDOUT_LOGGING"/>
</settings>
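If mybatis-config.xml already contains other elements, note that the DTD enforces a fixed child order: <settings> must come after <properties> and before <typeAliases>. A minimal sketch of where it sits (db.properties and the User alias below are placeholders, not part of this example):

```xml
<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE configuration PUBLIC "-//mybatis.org//DTD Config 3.0//EN"
        "https://mybatis.org/dtd/mybatis-3-config.dtd">
<configuration>
    <!-- placeholder external properties file -->
    <properties resource="db.properties"/>
    <!-- settings must appear here, before typeAliases -->
    <settings>
        <setting name="logImpl" value="STDOUT_LOGGING"/>
    </settings>
    <typeAliases>
        <typeAlias type="com.tian.pojo.User" alias="User"/>
    </typeAliases>
    <!-- environments, mappers, ... -->
</configuration>
```

Putting <settings> in the wrong position fails fast at startup with a parse error, which is easy to mistake for a logging problem.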

Run the query directly:

Opening JDBC Connection
Created connection 195228908.
Setting autocommit to false on JDBC Connection [com.mysql.cj.jdbc.ConnectionImpl@ba2f4ec]
==>  Preparing: select * from mybatis.user where id=? 
==> Parameters: 1(Integer)
<==    Columns: id, name, pwd
<==        Row: 1, 狂神, 123456
<==      Total: 1
User{id=1, name='狂神', password='123456'}
Resetting autocommit to true on JDBC Connection [com.mysql.cj.jdbc.ConnectionImpl@ba2f4ec]
Closing JDBC Connection [com.mysql.cj.jdbc.ConnectionImpl@ba2f4ec]
Returned connection 195228908 to pool.

LOG4J Explained

What is Log4j?

  • Log4j is an Apache open-source project. With Log4j, we can direct log output to destinations such as the console, files, or even GUI components;
  • we can also control the format of every log line;
  • by assigning a level to each log statement, we can control the logging process at a fine granularity;
  • and all of this is driven by a configuration file, with no changes to application code.

1. Import the dependency

<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17</version>
</dependency>

2.log4j.properties

# Send DEBUG-level (and above) output to the console and file destinations defined below
log4j.rootLogger=DEBUG,console,file

# Console output settings
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.Target=System.out
log4j.appender.console.Threshold=DEBUG
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=[%c]-%m%n

# File output settings
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=./log/kuang.log
log4j.appender.file.MaxFileSize=10mb
log4j.appender.file.Threshold=DEBUG
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=[%p][%d{yy-MM-dd}][%c]%m%n

# Log output levels
log4j.logger.org.mybatis=DEBUG
log4j.logger.java.sql=DEBUG
log4j.logger.java.sql.Statement=DEBUG
log4j.logger.java.sql.ResultSet=DEBUG
log4j.logger.java.sql.PreparedStatement=DEBUG
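For reference, the two ConversionPattern lines use the log4j 1.x PatternLayout conversion characters below; this is a quick annotated sketch, not additional required configuration:

```properties
# %c            logger (category) name, e.g. com.tian.dao.UserMapper.getUserById
# %p            level (priority): DEBUG, INFO, WARN, ERROR, ...
# %d{yy-MM-dd}  date/time in the given SimpleDateFormat pattern
# %m            the log message itself
# %n            platform-specific line separator
log4j.appender.file.layout.ConversionPattern=[%p][%d{yy-MM-dd}][%c]%m%n
```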


3. Configure Log4j as the logging implementation

<settings>
    <setting name="logImpl" value="LOG4J"/>
</settings>

4. Using Log4j: simply rerun the earlier query as a test

[org.apache.ibatis.logging.LogFactory]-Logging initialized using 'class org.apache.ibatis.logging.log4j.Log4jImpl' adapter.
[org.apache.ibatis.logging.LogFactory]-Logging initialized using 'class org.apache.ibatis.logging.log4j.Log4jImpl' adapter.
[org.apache.ibatis.datasource.pooled.PooledDataSource]-PooledDataSource forcefully closed/removed all connections.
[org.apache.ibatis.datasource.pooled.PooledDataSource]-PooledDataSource forcefully closed/removed all connections.
[org.apache.ibatis.datasource.pooled.PooledDataSource]-PooledDataSource forcefully closed/removed all connections.
[org.apache.ibatis.datasource.pooled.PooledDataSource]-PooledDataSource forcefully closed/removed all connections.
[org.apache.ibatis.transaction.jdbc.JdbcTransaction]-Opening JDBC Connection
[org.apache.ibatis.datasource.pooled.PooledDataSource]-Created connection 1840903588.
[org.apache.ibatis.transaction.jdbc.JdbcTransaction]-Setting autocommit to false on JDBC Connection [com.mysql.cj.jdbc.ConnectionImpl@6db9f5a4]
[com.tian.dao.UserMapper.getUserById]-==>  Preparing: select * from mybatis.user where id=? 
[com.tian.dao.UserMapper.getUserById]-==> Parameters: 1(Integer)
[com.tian.dao.UserMapper.getUserById]-<==      Total: 1
User{id=1, name='狂神', password='123456'}
[org.apache.ibatis.transaction.jdbc.JdbcTransaction]-Resetting autocommit to true on JDBC Connection [com.mysql.cj.jdbc.ConnectionImpl@6db9f5a4]
[org.apache.ibatis.transaction.jdbc.JdbcTransaction]-Closing JDBC Connection [com.mysql.cj.jdbc.ConnectionImpl@6db9f5a4]
[org.apache.ibatis.datasource.pooled.PooledDataSource]-Returned connection 1840903588 to pool.

Simple Usage

1. In the class that will use Log4j, import the package: import org.apache.log4j.Logger;

2. Create the logger object, passing the current class's Class object as the parameter

static Logger logger = Logger.getLogger(UserMapperTest.class);

3. Log levels

logger.info("info: entered testLog4j");
logger.debug("debug: entered testLog4j");
logger.error("error: entered testLog4j");
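The three calls above map to ascending severity levels (debug < info < error), and the Threshold/rootLogger settings in log4j.properties decide which of them survive. To see that filtering in action without any extra jars, here is a small sketch using java.util.logging from the JDK (MyBatis can also delegate to it via logImpl=JDK_LOGGING); the class name LevelDemo and the messages are invented for illustration:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Handler;
import java.util.logging.Level;
import java.util.logging.LogRecord;
import java.util.logging.Logger;

// Illustrates level filtering: a logger set to INFO drops anything below INFO.
public class LevelDemo {

    static List<String> capture() {
        Logger logger = Logger.getLogger("level.demo");
        logger.setUseParentHandlers(false); // keep output off the console
        logger.setLevel(Level.INFO);        // threshold: INFO and above pass

        final List<String> captured = new ArrayList<>();
        logger.addHandler(new Handler() {
            @Override public void publish(LogRecord r) {
                captured.add(r.getLevel() + ":" + r.getMessage());
            }
            @Override public void flush() {}
            @Override public void close() {}
        });

        logger.fine("debug message");   // FINE < INFO -> filtered out
        logger.info("info message");    // passes
        logger.severe("error message"); // passes
        return captured;
    }

    public static void main(String[] args) {
        System.out.println(capture()); // prints [INFO:info message, SEVERE:error message]
    }
}
```

The same mechanism is what makes a Log4j rootLogger at DEBUG show all three messages in testLog4j, while raising it to ERROR would show only the last one.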

4. Full code example

package com.tian.dao;

import com.tian.pojo.User;
import com.tian.utils.MybatisUtils;
import org.apache.ibatis.session.SqlSession;
import org.apache.log4j.Logger;
import org.junit.Test;

import java.util.List;

public class UserMapperTest {
    static Logger logger = Logger.getLogger(UserMapperTest.class);

    @Test
    public void test(){
        // obtain a SqlSession object
        SqlSession sqlSession = MybatisUtils.getSqlSession();
        logger.info("-------------------------");
        // approach 1: getMapper, then execute the SQL
        UserMapper userMapper = sqlSession.getMapper(UserMapper.class);
        User user = userMapper.getUserById(1);
        System.out.println(user);
        // close the SqlSession
        sqlSession.close();
    }

    @Test
    public void testLog4j(){
            logger.info("info: entered testLog4j");
            logger.debug("debug: entered testLog4j");
            logger.error("error: entered testLog4j");
    }

}

[start_analysis] ResultProcesCommonUtils start_analysis ... [2025-09-09 09:07:43,346] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/scripts/plugins/plugin_tools/ResultProcesCommonUtils/ResultProcesCommonUtils.py] [line:54] [extraction_perf_db] ResultProcesCommonUtils get sys db file ... [2025-09-09 09:07:43,346] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/scripts/plugins/plugin_tools/ResultProcesCommonUtils/ResultProcesCommonUtils.py] [line:62] [extraction_perf_db] ResultProcesCommonUtils query prio info ... [2025-09-09 09:07:43,346] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/scripts/plugins/plugin_tools/ResultProcesCommonUtils/ResultProcesCommonUtils.py] [line:71] [extraction_perf_db] ResultProcesCommonUtils extraction data size:0 [2025-09-09 09:07:43,347] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/scripts/plugins/plugin_tools/ResultProcesCommonUtils/ResultProcesCommonUtils.py] [line:91] [betaclub_fault_characteristics] ResultProcesCommonUtils betaclub_fault_characteristics [2025-09-09 09:07:43,347] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/scripts/plugins/plugin_tools/ResultProcesCommonUtils/ResultProcesCommonUtils.py] [line:100] [betaclub_fault_characteristics] ResultProcesCommonUtils betaclub_fault_characteristics characteristics_result is null [2025-09-09 09:07:43,347][INFO][1:137022433920704][/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/core/service/waiter/TaskRunner.py:129][result_proces_commonutils] end RESULT_PROCES_COMMONUTILS [2025-09-09 09:07:43,347] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/utils/basic/LogExtractionRecordUtils.py] [line:169] [insert_analyse_runtime_data] 开始往数据库发送 2 数据 [2025-09-09 09:07:43,666] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/utils/basic/LogExtractionRecordUtils.py] [line:177] [insert_analyse_runtime_data] 发送 【2】类型 数据量:1 
[2025-09-09 09:07:43,666][INFO][1:137022433920704][/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/core/manager/LogWaiterManager.py:120][run_task_queue] all tasks finish [2025-09-09 09:07:43,667][INFO][1:137022433920704][/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/HiviewPlatform.py:232][run] 分析任务主题运行结束,进入收尾工作。 [2025-09-09 09:07:43,667][INFO][1:137022433920704][/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/HiviewPlatform.py:48][wrapper] 当前函数 [run] 运行时间 5.73 秒 [2025-09-09 09:07:43,826] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/utils/basic/LogExtractionRecordUtils.py] [line:169] [insert_analyse_runtime_data] 开始往数据库发送 1 数据 [2025-09-09 09:07:44,186] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/utils/basic/LogExtractionRecordUtils.py] [line:177] [insert_analyse_runtime_data] 发送 【1】类型 数据量:1 [2025-09-09 09:07:44,187][INFO][1:137022433920704][/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/HiviewPlatform.py:125][start_analysis] ---------------------------------------------- [2025-09-09 09:07:44,187][INFO][1:137022433920704][/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/HiviewPlatform.py:126][start_analysis] 插件运行耗时:6.246186971664429 [2025-09-09 09:07:44,188][INFO][1:137022433920704][/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/HiviewPlatform.py:127][start_analysis] ---------------------------------------------- [2025-09-09 09:07:44,189] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/core/service/logistics/ResultAggregator.py] [line:57] [result_aggregation] {} [2025-09-09 09:07:44,189] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/core/service/logistics/ResultAggregator.py] [line:65] [result_aggregation] 执行插件[eqr7plLp_stability_script_third_NEXT]的结果汇总功能 [2025-09-09 09:07:44,189] [INFO] 
[/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/scripts/plugins/plugin_test/eqr7plLp_stability_script_third_NEXT.py] [line:371] [plugin_result_refinement] 这是分析结果精加工函数 [2025-09-09 09:07:44,190] [ERROR] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/core/service/logistics/ResultAggregator.py] [line:34] [get_user_defined_device] 设备类型无法从homepage_info中获取 'only_run_device' [2025-09-09 09:07:44,192][INFO][1:137022433920704][/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/HiviewPlatform.py:48][wrapper] 当前函数 [run_after] 运行时间 0.00 秒 [2025-09-09 09:07:44,350] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/utils/basic/LogExtractionRecordUtils.py] [line:169] [insert_analyse_runtime_data] 开始往数据库发送 1 数据 [2025-09-09 09:07:44,676] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/utils/basic/LogExtractionRecordUtils.py] [line:177] [insert_analyse_runtime_data] 发送 【1】类型 数据量:1 [2025-09-09 09:07:44,678] [ERROR] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/core/service/logistics/LogReporter.py] [line:39] [generate_report] 分析结果存入报告生成数据库失败[[Errno 2] No such file or directory: '/tmp/Hubble/LogkitWeb/AnalyseTask_web2EMUI/fa414108-60e0-4574-af3b-7bca73e8895e/duplicate.json'] [2025-09-09 09:07:44,679] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/core/service/logistics/ResultSaver.py] [line:35] [save_final_results] Plugins结果[faultdiag_plugins.json]保存成功 [2025-09-09 09:07:44,679] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/core/service/logistics/ResultSaver.py] [line:103] [plugin_result2loginfo] 插件没有一级定界结果 [2025-09-09 09:07:44,679] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/core/service/logistics/ResultSaver.py] [line:139] [refill_loginfo] Refill loginfo OK. 
[2025-09-09 09:07:44,680][INFO][1:137022433920704][/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/HiviewPlatform.py:247][save] 结果保存成功!(非保存,只做其他修改) [2025-09-09 09:07:44,680][INFO][1:137022433920704][/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/HiviewPlatform.py:48][wrapper] 当前函数 [save] 运行时间 0.00 秒 [2025-09-09 09:07:44,839] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/utils/basic/LogExtractionRecordUtils.py] [line:169] [insert_analyse_runtime_data] 开始往数据库发送 1 数据 [2025-09-09 09:07:45,170] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/utils/basic/LogExtractionRecordUtils.py] [line:177] [insert_analyse_runtime_data] 发送 【1】类型 数据量:1 [2025-09-09 09:07:45,171][INFO][1:137022433920704][/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/HiviewPlatform.py:130][start_analysis] 此次日志分析任务结束。 [2025-09-09 09:07:45,171][INFO][1:137022433920704][/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/HiviewPlatform.py:131][start_analysis] 结果路径是: None [2025-09-09 09:07:45,172][INFO][1:137022433920704][/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/HiviewPlatform.py:48][wrapper] 当前函数 [start_analysis] 运行时间 11.72 秒 [2025-09-09 09:07:45,329] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/utils/basic/LogExtractionRecordUtils.py] [line:169] [insert_analyse_runtime_data] 开始往数据库发送 0 数据 [2025-09-09 09:07:45,651] [INFO] [/usr/src/app/aresdeepmindhubble-6.0/aresdeepmindhubble/Hubble/utils/basic/LogExtractionRecordUtils.py] [line:177] [insert_analyse_runtime_data] 发送 【0】类型 数据量:1
09-10
onDestroy: MainActivity 2025-09-02 19:46:49.064 11143-11143 WindowOnBackDispatcher com.raoshenglin.music050816 W sendCancelIfRunning: isInProgress=false callback=android.view.ViewRootImpl$$ExternalSyntheticLambda12@1cf3322 2025-09-02 19:46:49.065 11143-11143 DecorViewImmersiveImpl com.raoshenglin.music050816 D onDetachedFromWindow 2025-09-02 19:46:49.250 11143-11143 MediaPlayer com.raoshenglin.music050816 V resetDrmState: mDrmInfo=null mDrmProvisioningThread=null mPrepareDrmInProgress=false mActiveDrmScheme=false 2025-09-02 19:46:49.250 11143-11143 MediaPlayer com.raoshenglin.music050816 V cleanDrmObj: mDrmObj=null mDrmSessionId=null 2025-09-02 19:46:49.256 11143-11143 FileUtils com.raoshenglin.music050816 E err write to mi_exception_log 2025-09-02 19:46:49.444 11143-25502 lin.music050816 com.raoshenglin.music050816 I This is non sticky GC, maxfree is 33554432 minfree is 8388608 2025-09-02 19:46:50.229 11143-25502 lin.music050816 com.raoshenglin.music050816 I This is sticky GC, maxfree is 33554432 minfree is 8388608 2025-09-02 19:46:50.822 11143-25502 lin.music050816 com.raoshenglin.music050816 I This is non sticky GC, maxfree is 33554432 minfree is 8388608 2025-09-02 19:46:51.385 11143-25502 lin.music050816 com.raoshenglin.music050816 I This is sticky GC, maxfree is 33554432 minfree is 8388608 2025-09-02 19:46:51.858 11143-25502 lin.music050816 com.raoshenglin.music050816 I This is non sticky GC, maxfree is 33554432 minfree is 8388608 2025-09-02 19:46:52.291 1855-1855 qspmHal vendor.qti.qspmhal-service E setAppInfoH atPid = 11143, gpuFname:com.raoshenglin.music050816, gpuFver:10
09-03