I'm running CentOS 6.2.
After building and deploying Spark 1.1.0 and 1.2.0, I found that the workers would never start. Here is the fix, in case it helps others.
Error messages:
Spark 1.1.0
vm1: failed to launch org.apache.spark.deploy.worker.Worker:
vm1: at java.lang.Class.initializeClass(libgcj.so.10)
vm1: ...22 more
vm1: full log in /home/hadoop/spark/spark110_hadoop220/sbin/../logs/spark-hadoop-org.apache.spark.deploy.worker.Worker-1-vm1.out
Spark 1.2.0
Exception in thread "main" java.lang.ClassFormatError: org.apache.spark.util.Utils$ (erroneous class name)
at java.lang.VMClassLoader.defineClass(libgcj.so.10)
at java.lang.ClassLoader.defineClass(libgcj.so.10)
at java.security.SecureClassLoader.defineClass(libgcj.so.10)
at java.net.URLClassLoader.findClass(libgcj.so.10)
at java.lang.ClassLoader.loadClass(libgcj.so.10)
at java.lang.ClassLoader.loadClass(libgcj.so.10)
at org.apache.spark.Logging$class.initializeLogging(Logging.scala:124)
at org.apache.spark.Logging$class.initializeIfNecessary(Logging.scala:107)
at org.apache.spark.Logging$class.log(Logging.scala:51)
at org.apache.spark.deploy.worker.Worker$.log(Worker.scala:470)
at org.apache.spark.deploy.worker.Worker$.main(Worker.scala:472)
at org.apache.spark.deploy.worker.Worker.main(Worker.scala)
Both failures are Java problems: the libgcj frames in the stack traces show that the launch scripts picked up the GCJ runtime bundled with the OS, which is too old to run Spark.
Setting JAVA_HOME in conf/spark-env.sh fixes this.
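A minimal sketch of the fix (the JDK path below is a placeholder, not from the original post; substitute wherever your real JDK is installed):

```shell
# conf/spark-env.sh
# Point Spark at a real JDK instead of the GCJ runtime found first on $PATH.
# /usr/java/jdk1.7.0_71 is an assumed example path -- adjust to your install.
export JAVA_HOME=/usr/java/jdk1.7.0_71
export PATH=$JAVA_HOME/bin:$PATH
```

Note that spark-env.sh (usually created by copying conf/spark-env.sh.template) must exist on every worker node, because start-all.sh launches the workers over SSH in non-interactive shells that may not source your login profile. You can check which Java a node resolves with `readlink -f "$(which java)"`; on an affected machine it points into the gcj install.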
Reposted from: http://f.dataguru.cn/thread-471416-1-1.html