Fixing "raw cannot be resolved or is not a field"

This post explains the cause of and fix for the "raw cannot be resolved or is not a field" error that can appear when building an Android app, covering both checking the location of the raw directory and clearing the project's build cache.

Building an Android app fails with the following error:

raw cannot be resolved or is not a field
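
For context, the error surfaces on any code that references a resource through the generated R.raw class. A minimal sketch of such a call site (the resource name config_rules is hypothetical) looks like this:

```java
import java.io.InputStream;

import android.content.Context;

public class RawReader {
    // Fails to compile with "raw cannot be resolved or is not a field"
    // when the generated R class has no raw nested class, i.e. when
    // res/raw/ is missing or empty.
    public static InputStream open(Context context) {
        return context.getResources().openRawResource(R.raw.config_rules);
    }
}
```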

There are only two possible causes for this error:

  1. The raw directory is not under res/, or the resource file is not inside raw/ (see the sketch after this list).
  2. A stale app build cache:
    Android Studio fix: run Clean Project, then rebuild.
    Android platform (AOSP) source build fix: delete the intermediate output out/target/common/obj/APPS/xxx_intermediates/ and rebuild the module.
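
To make cause 1 concrete, here is a minimal sketch assuming a sound file placed at app/src/main/res/raw/sample_sound.mp3 (the file name is hypothetical). Once the file sits under res/raw/, the build generates R.raw.sample_sound and the reference resolves:

```java
// Expected layout for cause 1:
//   app/src/main/res/raw/sample_sound.mp3   (must be under res/raw/, not res/ or assets/)
import android.app.Activity;
import android.media.MediaPlayer;
import android.os.Bundle;

public class PlayRawSoundActivity extends Activity {
    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Resolves only because res/raw/sample_sound.mp3 exists (hypothetical file).
        MediaPlayer player = MediaPlayer.create(this, R.raw.sample_sound);
        player.start();
    }
}
```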