Spark build fails with "error: object apache is not a member of package org"

When compiling a Spark program, the build fails with "object apache is not a member of package org" on the imports of Spark classes such as SparkConf and SparkContext. The likely cause is a local Maven repository path that is too long and too deeply nested; the fix is to move the local repository to a shorter location such as E:\Study\BigData\repository.


The Spark program fails to compile with the following errors:

[INFO] Compiling 2 source files to E:\Develop\IDEAWorkspace\spark\target\classes at 1567004370534
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:3: error: object apache is not a member of package org
[ERROR] import org.apache.spark.rdd.RDD
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:4: error: object apache is not a member of package org
[ERROR] import org.apache.spark.{SparkConf, SparkContext}
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:12: error: not found: type SparkConf
[ERROR] val sparkConf: SparkConf = new SparkConf().setAppName("WordCount").setMaster("local[2]")
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:12: error: not found: type SparkConf
[ERROR] val sparkConf: SparkConf = new SparkConf().setAppName("WordCount").setMaster("local[2]")
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:14: error: not found: type SparkContext
[ERROR] val sc: SparkContext = new SparkContext(sparkConf)
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:14: error: not found: type SparkContext
[ERROR] val sc: SparkContext = new SparkContext(sparkConf)
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:18: error: not found: type RDD
[ERROR] val data: RDD[String] = sc.textFile("E:\\Study\\BigData\\heima\\stage5\\2spark����\\words.txt")
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:20: error: not found: type RDD
[ERROR] val words: RDD[String] = data.flatMap(_.split(" "))
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:22: error: not found: type RDD
[ERROR] val wordToOne: RDD[(String, Int)] = words.map((_,1))
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:24: error: not found: type RDD
[ERROR] val result: RDD[(String, Int)] = wordToOne.reduceByKey(_+_)
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCount.scala:27: error: not found: type RDD
[ERROR] val ascResult: RDD[(String, Int)] = result.sortBy(_._2,false) //����
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:3: error: object apache is not a member of package org
[ERROR] import org.apache.spark.{SparkConf, SparkContext}
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:4: error: object apache is not a member of package org
[ERROR] import org.apache.spark.rdd.RDD
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:12: error: not found: type SparkConf
[ERROR] val sparkConf: SparkConf = new SparkConf().setAppName("WordCountCluster")
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:12: error: not found: type SparkConf
[ERROR] val sparkConf: SparkConf = new SparkConf().setAppName("WordCountCluster")
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:14: error: not found: type SparkContext
[ERROR] val sc: SparkContext = new SparkContext(sparkConf)
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:14: error: not found: type SparkContext
[ERROR] val sc: SparkContext = new SparkContext(sparkConf)
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:18: error: not found: type RDD
[ERROR] val data: RDD[String] = sc.textFile(args(0))
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:20: error: not found: type RDD
[ERROR] val words: RDD[String] = data.flatMap(_.split(" "))
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:22: error: not found: type RDD
[ERROR] val wordToOne: RDD[(String, Int)] = words.map((_,1))
[ERROR] ^
[ERROR] E:\Develop\IDEAWorkspace\spark\src\main\scala\cn\itcast\wordCount\WordCountCluster.scala:24: error: not found: type RDD
[ERROR] val result: RDD[(String, Int)] = wordToOne.reduceByKey(_+_)
[ERROR] ^
[ERROR] 21 errors found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
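
At its root, "object apache is not a member of package org" simply means the Scala compiler found no org.apache.spark classes on the compile classpath, i.e. Maven failed to resolve the Spark dependency from the local repository. As a sanity check, the project's pom.xml should declare Spark core along these lines (a minimal sketch; the Spark and Scala versions shown are illustrative assumptions, not taken from the original project):

<!-- Minimal sketch: Spark core for a Scala 2.11 build; versions are illustrative -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.4.3</version>
</dependency>

If this block is present and correct yet the imports still fail, the problem lies in dependency resolution rather than in the POM, which matches the diagnosis below.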

Cause: the local Maven repository is at fault. Most likely the original repository path was too long and too deeply nested. The repository contents themselves are fine: after I copied the same repository into the E:\Study\BigData\ directory, everything worked normally.
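
To confirm which local repository Maven actually resolves against, you can query it with the standard maven-help-plugin (shipped with any stock Maven install):

mvn help:evaluate -Dexpression=settings.localRepository

If the printed path is not the repository you expect, or is unusually long and deeply nested, that supports the diagnosis above.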

Solution:

The Spark project's local Maven repository was originally: E:\Study\BigData\heima\stage5\1scala\scala3\spark课程需要的maven仓库\SparkRepository

After changing it to E:\Study\BigData\repository, the build worked.
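
The local repository location is set by the top-level localRepository element in Maven's settings.xml (on Windows typically %USERPROFILE%\.m2\settings.xml, or whichever settings file the IDE is configured to use). A minimal sketch, assuming the shorter path from above:

<settings xmlns="http://maven.apache.org/SETTINGS/1.0.0">
    <!-- Keep the local repository at a short, shallow path -->
    <localRepository>E:\Study\BigData\repository</localRepository>
</settings>

In IntelliJ IDEA the same path can also be overridden under File > Settings > Build, Execution, Deployment > Build Tools > Maven > "Local repository"; after changing it, re-import the Maven project so the new location takes effect.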

Reposted from: https://www.cnblogs.com/mediocreWorld/p/11427088.html
