2012-11-02 10:55 Fixing the ADS error "the session file 'C:\user\username\default-1-2-0-0.ses' could not be loaded"

This article explains how to resolve the error "the session file could not be loaded", which appears when a project built with ADS 1.2 plus H-JTAG or H-JLink is debugged in AXD. Two fixes are covered: moving the project to a directory whose path contains only English letters or digits, and unchecking the "Save and load default session file" option in the debug target's interface configuration. Either prevents the error window from popping up.



Problem: When using ADS 1.2 with H-JTAG or H-JLink, the error "the session file could not be loaded" appears every time a debug session is started. How can this be fixed?

Analysis: After the user's project compiles successfully, entering the AXD debug environment pops up the error window shown below.

Solutions

There are two ways to resolve this problem:

1. Move the project

Store the project in a directory whose path contains only English letters or digits (no Chinese characters). This is the simplest fix.

PS: As a general rule, use only English characters in anything related to programming, and avoid Chinese characters entirely; they can cause incompatibilities between your program and the tools that process it.
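The rule behind method 1 is easy to check mechanically. As a hypothetical illustration (not part of ADS or the H-JTAG tools), a small Python helper can verify that a project path is pure ASCII before you open it in AXD:

```python
def is_safe_project_path(path: str) -> bool:
    """Return True if the path contains only ASCII characters.

    ADS 1.2 / AXD may fail to load the session (.ses) file when the
    project path contains non-ASCII (e.g. Chinese) characters, so a
    pure-ASCII path is the safe choice.
    """
    try:
        path.encode("ascii")
        return True
    except UnicodeEncodeError:
        return False


print(is_safe_project_path("C:\\projects\\demo"))  # True: ASCII-only path
print(is_safe_project_path("C:\\用户\\工程"))        # False: contains Chinese characters
```

The function name and the sample paths are illustrative only; the point is simply that any character outside ASCII in the project path can trigger the session-file error.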

2. Change the debug target's interface configuration

Click Options (the same for both H-JTAG and H-JLink) -> Configure Interface -> General, then uncheck the "Save and load default session file" option. This stops the error window from popping up during debugging, and the project can stay in its current location.


Both methods have been verified to work.


