1.
Using Spark’s default log4j profile: org/apache/spark/log4j-defaults.properties
18/09/12 10:18:05 INFO SparkContext: Running Spark version 1.6.3
18/09/12 10:18:07 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: A master URL must be set in your configuration
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:401)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:123)
	at com.keduox.Test01.main(Test01.scala:9)
	at com.keduox.Test01.main(Test01.scala)
18/09/12 10:18:07 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: A master URL must be set in your configuration
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:401)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:123)
	at com.keduox.Test01.main(Test01.scala:9)
	at com.keduox.Test01.main(Test01.scala)
Exception detail: A master URL must be set in your configuration
A master URL must be set in the configuration:
// On a cluster: "spark://master:7077"
// Locally: "local"
conf.setMaster("local")
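For reference, a minimal sketch of the common master URL choices (the host `master` is a placeholder for a real standalone-cluster master):

```scala
import org.apache.spark.SparkConf

// Exactly one master URL must be chosen before the SparkContext is built
val conf = new SparkConf()
conf.setMaster("local")                  // local mode, single thread
// conf.setMaster("local[2]")            // local mode, two threads
// conf.setMaster("spark://master:7077") // Spark standalone cluster
```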
2.
Using Spark’s default log4j profile: org/apache/spark/log4j-defaults.properties
18/09/12 10:33:55 INFO SparkContext: Running Spark version 1.6.3
18/09/12 10:33:56 ERROR SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: An application name must be set in your configuration
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:404)
	at com.keduox.Test01.main(Test01.scala:15)
	at com.keduox.Test01.main(Test01.scala)
18/09/12 10:33:56 INFO SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: An application name must be set in your configuration
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:404)
	at com.keduox.Test01.main(Test01.scala:15)
	at com.keduox.Test01.main(Test01.scala)
Exception detail: An application name must be set in your configuration
The configuration must also include an application name:
// Set the application name
conf.setAppName("mytest01")
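Combining both fixes, a minimal sketch of a corrected `Test01` entry point (the trivial job at the end is only there to confirm the context starts):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object Test01 {
  def main(args: Array[String]): Unit = {
    // Both settings are required before the SparkContext is constructed
    val conf = new SparkConf()
      .setMaster("local")      // fixes error 1: master URL
      .setAppName("mytest01")  // fixes error 2: application name
    val sc = new SparkContext(conf)

    // trivial job to confirm the context works
    println(sc.parallelize(1 to 10).sum())

    sc.stop()
  }
}
```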
3.
**Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;**
Cause: Scala version mismatch. Spark 1.6.3 is built against Scala 2.10, but the project is configured with Scala 2.11.7.
On Windows, multiple Scala versions can coexist, so there is no need to uninstall the existing one. Extract the Scala 2.10 tgz package, then in the project's dependency settings remove the 2.11 library and add the 2.10 library to the Spark project's dependencies.
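If the project were built with sbt rather than IDE-managed libraries (an assumption; the original fix swaps the jars by hand), the same correction is to pin the Scala version and let the matching Spark artifact be resolved:

```scala
// build.sbt — hypothetical sketch, assuming an sbt build
scalaVersion := "2.10.6"  // Spark 1.6.3 prebuilt artifacts target the Scala 2.10 line

// %% appends the Scala binary suffix, yielding spark-core_2.10;
// mixing a 2.10 compiler with _2.11 artifacts causes the NoSuchMethodError above
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.6.3"
```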