【Android】Error:Execution failed for task ':app:mergeDebugResources'. > String index out of range: 0

This post describes a StringIndexOutOfBoundsException hit while building an Android project. The exception was caused by an empty string resource in strings.xml; inspecting and fixing that file resolved the problem.

Error message:

Error:Execution failed for task ':app:mergeDebugResources'.
> String index out of range: 0

Root cause:

One day during a build, Android Studio threw this exception at me, and at first I was stumped. Since the failure was in mergeDebugResources, the problem was most likely somewhere in the res directory, but the message "String index out of range: 0" meant nothing to me at the time. I first went through the res/drawable-xx directories carefully and found nothing wrong with those resource files. Then I started checking the files under res/values: styles.xml was fine, colors.xml was fine, and finally I found the culprit in strings.xml. During my last edit I had meant to add a new text entry, but had only written:
<string name=""></string>
If only the value were empty, the project would still compile and run fine. But here I had forgotten to define the key as well: when Android Studio validated the resources before compiling, it found a string entry with no name, and reported it as "String index out of range: 0".
Deleting this entry from strings.xml, or filling in its name, resolved the exception. A reminder to be more careful next time.
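For reference, a minimal valid entry looks like this (the resource name app_greeting is just an illustrative placeholder, not from the original project):

```xml
<!-- res/values/strings.xml -->
<resources>
    <!-- Every <string> must have a non-empty name attribute;
         the text body itself is allowed to be empty. -->
    <string name="app_greeting">Hello</string>
</resources>
```

Once the name is defined, the resource can be referenced as `@string/app_greeting` in layouts or `R.string.app_greeting` in code.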
