Android Project - R cannot be resolved or is not a field: one solution!

This post covers two common compatibility problems in Android app development and their fixes: first, make sure AndroidManifest.xml correctly declares minSdkVersion and targetSdkVersion; second, import the appcompat_v7 library and set it as a Library dependency of the project.


Case 1: the <uses-sdk> element below (highlighted in red in the original screenshot) is missing from the project's AndroidManifest.xml:

    <uses-sdk
        android:minSdkVersion="14"
        android:targetSdkVersion="21" />
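For reference, a minimal AndroidManifest.xml sketch showing where the <uses-sdk> element belongs; the package name and application attributes here are placeholders, not from the original project:

    <?xml version="1.0" encoding="utf-8"?>
    <manifest xmlns:android="http://schemas.android.com/apk/res/android"
        package="com.example.app" >

        <!-- A missing or malformed uses-sdk can break the build,
             so R.java is never generated -->
        <uses-sdk
            android:minSdkVersion="14"
            android:targetSdkVersion="21" />

        <application
            android:icon="@drawable/ic_launcher"
            android:label="@string/app_name" >
        </application>

    </manifest>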

Case 2: appcompat_v7 has not been imported:

  

      Correct procedure:

      Import the appcompat_v7 project into the workspace.

      Then each project must add appcompat_v7 as one of its Libraries (Project Properties -> Android -> Library -> Add):

    Finally, run Clean --> Build All again.
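Under Eclipse/ADT, marking appcompat_v7 as a Library records a reference in the dependent project's project.properties. A sketch of what that file then contains; the relative path is an assumption about where appcompat_v7 sits in your workspace:

    # project.properties of the app project (path depends on workspace layout)
    target=android-21
    android.library.reference.1=../appcompat_v7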
