I was running a Spark program in my IDE; the code is as follows:
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class testSpark {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("Testing").setMaster("local");
        JavaSparkContext sc = new JavaSparkContext(conf);
        System.out.println(sc.appName());
    }
}
But when I ran it, it failed with the following error:
Exception in thread "main" java.lang.NoClassDefFoundError: scala/xml/MetaData
    at org.apache.spark.ui.jobs.JobsTab.&lt;init&gt;(JobsTab.scala:30)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:50)
    at org.apache.spark.ui.SparkUI.&lt;init&gt;(SparkUI.scala:61)
    at iScope.testSpark.main(testSpark.java:9)
Caused by: java.lang.ClassNotFoundException: scala.xml.MetaData
    at java.net.URLClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
My pom file:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>1.2.1</version>
</dependency>
<dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>18.0</version>
</dependency>
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-mllib_2.11</artifactId>
    <version>1.2.1</version>
</dependency>
The fix is to add the scala-xml jar to the pom. Starting with Scala 2.11, scala-xml was split out of the Scala standard library into a separate module, so it is no longer on the classpath automatically and Spark's web UI (which uses it) fails to load unless it is declared explicitly:
<dependency>
    <groupId>org.scala-lang</groupId>
    <artifactId>scala-xml</artifactId>
    <version>2.11.0-M4</version>
</dependency>
Original source: https://stackoverflow.com/questions/30049797/noclassdeffound-scala-xml-metadata