1. Error description:
org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:75)
recommend_util$.<init>(recommend_util.scala:10)
recommend_util$.<clinit>(recommend_util.scala)
recommend_demo1$.main(recommend_demo1.scala:11)
recommend_demo1.main(recommend_demo1.scala)
at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2456)
at org.apache.spark.SparkContext$$anonfun$assertNoOtherContextIsRunning$2.apply(SparkContext.scala:2452)
at scala.Option.foreach(Option.scala:257)
at org.apache.spark.SparkContext$.assertNoOtherContextIsRunning(SparkContext.scala:2452)
at org.apache.spark.SparkContext$.setActiveContext(SparkContext.scala:2554)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:2408)
at recommend_demo1$.main(recommend_demo1.scala:15)
at recommend_demo1.main(recommend_demo1.scala)
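Based on the creation sites shown in the trace, a minimal layout that can trigger this error looks roughly like the sketch below. The names recommend_util and recommend_demo1 are taken from the trace; the real project code is not shown, so the bodies here are only an assumption used for illustration.

import org.apache.spark.{SparkConf, SparkContext}

object recommend_util {
  // First SparkContext, created when this object is first referenced (static initializer)
  val sc = new SparkContext(new SparkConf().setAppName("recommend_util").setMaster("local[*]"))
}

object recommend_demo1 {
  def main(args: Array[String]): Unit = {
    // Referencing recommend_util runs its initializer and creates context #1
    val firstSc = recommend_util.sc
    // Creating a second SparkContext in the same JVM throws the SparkException above
    val secondSc = new SparkContext(new SparkConf().setAppName("recommend_demo1").setMaster("local[*]"))
  }
}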
2. Solution:
sparkConf.set("spark.driver.allowMultipleContexts","true");
This post covers the Apache Spark error stating that only one SparkContext may be running in the same JVM, and resolves it by setting the spark.driver.allowMultipleContexts parameter on the SparkConf so that multiple SparkContexts are allowed to run simultaneously.