Spark-to-Elasticsearch connection problem: java.lang.ClassNotFoundException: org.elasticsearch.client.transport.NoNodeAvailableException

Running the Spark job:

The daily.out.20171012 log contains the following error:

17/10/17 15:51:06 WARN spark.ThrowableSerializationWrapper: Task exception could not be deserialized
java.lang.ClassNotFoundException: org.elasticsearch.client.transport.NoNodeAvailableException
        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:270)
        at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67)
        at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
        at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1771)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at org.apache.spark.ThrowableSerializationWrapper.readObject(TaskEndReason.scala:167)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1893)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1990)
        at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1915)
        at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1798)
        at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
        at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
        at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:72)
        at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:98)
        at org.apache.spark.scheduler.TaskResultGetter$$anon$3$$anonfun$run$2.apply$mcV$sp(TaskResultGetter.scala:108)
        at org.apache.spark.scheduler.TaskResultGetter$$anon$3$$anonfun$run$2.apply(TaskResultGetter.scala:105)
        at org.apache.spark.scheduler.TaskResultGetter$$anon$3$$anonfun$run$2.apply(TaskResultGetter.scala:105)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1699)
        at org.apache.spark.scheduler.TaskResultGetter$$anon$3.run(TaskResultGetter.scala:105)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)

17/10/17 15:51:07 INFO storage.BlockManager: BlockManager stopped
17/10/17 15:51:07 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
17/10/17 15:51:07 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
17/10/17 15:51:07 INFO remote.RemoteActorRefProvider$RemotingTerminator: Shutting down remote daemon.
17/10/17 15:51:07 INFO remote.RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
17/10/17 15:51:07 INFO spark.SparkContext: Successfully stopped SparkContext
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 8 in stage 20.0 failed 4 times, most recent failure: Lost task 8.3 in stage 20.0 (TID 65, zjhydsz11.shbank.com): UnknownReason

Driver stacktrace:
        at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1294)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1282)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1281)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1281)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
        at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
        at scala.Option.foreach(Option.scala:236)

Digging further into the executor stderr in the Spark web UI reveals the underlying problem:

 

17/10/17 16:40:05 INFO elasticsearch.plugins: [Ritchie Gilmore] loaded [], sites []
17/10/17 16:40:05 INFO elasticsearch.plugins: [Jarella] loaded [], sites []
17/10/17 16:40:06 ERROR client.transport: [Jarella] failed to get node info for {#transport#-1}{10.240.44.40}{10.240.44.40:9300}, disconnecting...
RemoteTransportException[[Failed to deserialize response of type [org.elasticsearch.action.admin.cluster.node.liveness.LivenessResponse]]]; nested: TransportSerializationException[Failed to deserialize response of type [org.elasticsearch.action.admin.cluster.node.liveness.LivenessResponse]]; nested: ExceptionInInitializerError; nested: IllegalArgumentException[An SPI class of type org.apache.lucene.codecs.PostingsFormat with name 'Lucene50' does not exist. You need to add the corresponding JAR file supporting this SPI to your classpath. The current classpath supports the following names: [es090, completion090, XBloomFilter]];
Caused by: TransportSerializationException[Failed to deserialize response of type [org.elasticsearch.action.admin.cluster.node.liveness.LivenessResponse]]; nested: ExceptionInInitializerError; nested: IllegalArgumentException[An SPI class of type org.apache.lucene.codecs.PostingsFormat with name 'Lucene50' does not exist. You need to add the corresponding JAR file supporting this SPI to your classpath. The current classpath supports the following names: [es090, completion090, XBloomFilter]];
        at org.elasticsearch.transport.netty.MessageChannelHandler.handleResponse(MessageChannelHandler.java:179)
        at org.elasticsearch.transport.netty.MessageChannelHandler.messageReceived(MessageChannelHandler.java:138)
        at transwarp.org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
        at transwarp.org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
        at transwarp.org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
        at transwarp.org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)
        at transwarp.org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
        at transwarp.org.jboss.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
        at transwarp.org.jboss.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
        at transwarp.org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
        at transwarp.org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
        at transwarp.org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
        at transwarp.org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)
        at transwarp.org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)
        at transwarp.org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
        at transwarp.org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)
        at transwarp.org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337)
        at transwarp.org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
        at transwarp.org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
        at transwarp.org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
        at transwarp.org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ExceptionInInitializerError
        at org.elasticsearch.Version.fromId(Version.java:500)
        at org.elasticsearch.Version.readVersion(Version.java:276)
        at org.elasticsearch.cluster.node.DiscoveryNode.readFrom(DiscoveryNode.java:326)
        at org.elasticsearch.cluster.node.DiscoveryNode.readNode(DiscoveryNode.java:309)
        at org.elasticsearch.action.admin.cluster.node.liveness.LivenessResponse.readFrom(LivenessResponse.java:52)
        at org.elasticsearch.transport.netty.MessageChannelHandler.handleResponse(MessageChannelHandler.java:177)
        ... 23 more
Caused by: java.lang.IllegalArgumentException: An SPI class of type org.apache.lucene.codecs.PostingsFormat with name 'Lucene50' does not exist. You need to add the corresponding JAR file supporting this SPI to your classpath. The current classpath supports the following names: [es090, completion090, XBloomFilter]
        at org.apache.lucene.util.NamedSPILoader.lookup(NamedSPILoader.java:109)
        at org.apache.lucene.codecs.PostingsFormat.forName(PostingsFormat.java:112)
        at org.elasticsearch.common.lucene.Lucene.<clinit>(Lucene.java:103)
        ... 29 more

17/10/17 16:40:06 INFO executor.Executor: Finished task 4.0 in stage 20.0 (TID 46). 1560 bytes result sent to driver
17/10/17 16:40:06 INFO executor.Executor: Finished task 1.0 in stage 20.0 (TID 43). 1560 bytes result sent to driver
17/10/17 16:40:06 INFO executor.CoarseGrainedExecutorBackend: Got assigned task 50
17/10/17 16:40:06 INFO executor.Executor: Running task 3.1 in stage 20.0 (TID 50)
17/10/17 16:40:06 INFO executor.CoarseGrainedExecutorBackend: Got assigned task 51
17/10/17 16:40:06 INFO executor.Executor: Running task 8.0 in stage 20.0 (TID 51)
17/10/17 16:40:06 INFO storage.ShuffleBlockFetcherIterator: Getting 7 non-empty blocks out of 16 blocks
17/10/17 16:40:06 INFO storage.ShuffleBlockFetcherIterator: Getting 1 non-empty blocks out of 16 blocks
17/10/17 16:40:06 INFO storage.ShuffleBlockFetcherIterator: Started 0 remote fetches in 0 ms
17/10/17 16:40:06 INFO storage.ShuffleBlockFetcherIterator: Started 1 remote fetches in 1 ms
17/10/17 16:40:06 INFO elasticsearch.plugins: [Ultragirl] loaded [], sites []
17/10/17 16:40:06 INFO elasticsearch.plugins: [Karma] loaded [], sites []
17/10/17 16:40:06 ERROR client.transport: [Karma] failed to get node info for {#transport#-1}{10.240.44.40}{10.240.44.40:9300}, disconnecting...
RemoteTransportException[[Failed to deserialize response of type [org.elasticsearch.action.admin.cluster.node.liveness.LivenessResponse]]]; nested: TransportSerializationException[Failed to deserialize response of type [org.elasticsearch.action.admin.cluster.node.liveness.LivenessResponse]]; nested: NoClassDefFoundError[Could not initialize class org.elasticsearch.common.lucene.Lucene];
Caused by: TransportSerializationException[Failed to deserialize response of type [org.elasticsearch.action.admin.cluster.node.liveness.LivenessResponse]]; nested: NoClassDefFoundError[Could not initialize class org.elasticsearch.common.lucene.Lucene];
        at org.elasticsearch.transport.netty.MessageChannelHandler.handleResponse(MessageChannelHandler.java:179)
        at org.elasticsearch.transport.netty.MessageChannelHandler.messageReceived(MessageChannelHandler.java:138)
        at transwarp.org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
        at transwarp.org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
        at transwarp.org.jboss.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791)
        at transwarp.org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:296)
        at transwarp.org.jboss.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:462)
        at transwarp.org.jboss.netty.handler.codec.frame.FrameDecoder.callDecode(FrameDecoder.java:443)
        at transwarp.org.jboss.netty.handler.codec.frame.FrameDecoder.messageReceived(FrameDecoder.java:303)
        at transwarp.org.jboss.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70)
        at transwarp.org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564)
        at transwarp.org.jboss.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559)
        at transwarp.org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:268)
        at transwarp.org.jboss.netty.channel.Channels.fireMessageReceived(Channels.java:255)
        at transwarp.org.jboss.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88)
        at transwarp.org.jboss.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108)
        at transwarp.org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:337)
        at transwarp.org.jboss.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89)
        at transwarp.org.jboss.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178)
        at transwarp.org.jboss.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108)
        at transwarp.org.jboss.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.NoClassDefFoundError: Could not initialize class org.elasticsearch.common.lucene.Lucene
        at org.elasticsearch.Version.fromId(Version.java:500)
        at org.elasticsearch.Version.readVersion(Version.java:276)
        at org.elasticsearch.cluster.node.DiscoveryNode.readFrom(DiscoveryNode.java:326)
        at org.elasticsearch.cluster.node.DiscoveryNode.readNode(DiscoveryNode.java:309)
        at org.elasticsearch.action.admin.cluster.node.liveness.LivenessResponse.readFrom(LivenessResponse.java:52)
        at org.elasticsearch.transport.netty.MessageChannelHandler.handleResponse(MessageChannelHandler.java:177)
        ... 23 more


 

A quick Google search turned up this thread: https://stackoverflow.com/questions/39023903/elasticsearch-transportclient-fails-with-could-not-initialize-class-org-elastics

 

One of the answers there:

The explanation of this problem is described here (https://github.com/elastic/elasticsearch/issues/3350). In short, Elasticsearch and its Lucene dependencies have conflicting files in META-INF/services, and when they are combined into a single jar those files overwrite one another.

 

If you want to create a single jar containing your application and all its dependencies, you should not use the maven-assembly-plugin, because it cannot handle the META-INF/services structure that the Lucene jars require.

If you're encountering this problem, you're probably building a single jar. If you're using Maven, you can solve it with the shade plugin, which merges these files instead of letting them overwrite one another:

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <transformers>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                        <resource>META-INF/services/org.apache.lucene.codecs.Codec</resource>
                    </transformer>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                        <resource>META-INF/services/org.apache.lucene.codecs.DocValuesFormat</resource>
                    </transformer>
                    <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                        <resource>META-INF/services/org.apache.lucene.codecs.PostingsFormat</resource>
                    </transformer>
                </transformers>
                <shadedArtifactAttached>true</shadedArtifactAttached>
                <shadedClassifierName>fat</shadedClassifierName>
            </configuration>
        </execution>
    </executions>
</plugin>
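For context on why the AppendingTransformer entries above matter: both Lucene and Elasticsearch discover implementations through Java's SPI convention, where a text file under META-INF/services, named after the service interface, lists provider class names. The standalone sketch below (purely illustrative, not part of the fix; the temp-directory setup and the Runnable/Thread pairing are chosen only because they need nothing beyond the JDK) demonstrates that lookup mechanism:

```java
import java.io.IOException;
import java.net.URL;
import java.net.URLClassLoader;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ServiceLoader;

public class SpiDemo {
    public static void main(String[] args) throws IOException {
        // Build a throwaway "classpath root" containing a single SPI
        // registration file -- the same kind of file the shade plugin's
        // AppendingTransformer must append rather than overwrite.
        Path root = Files.createTempDirectory("spi-demo");
        Path services = Files.createDirectories(root.resolve("META-INF/services"));
        // Register java.lang.Thread as a provider of java.lang.Runnable.
        Files.write(services.resolve("java.lang.Runnable"),
                    "java.lang.Thread\n".getBytes());

        try (URLClassLoader loader =
                 new URLClassLoader(new URL[] { root.toUri().toURL() })) {
            // ServiceLoader instantiates whatever class names it finds
            // in the META-INF/services file visible to this classloader.
            for (Runnable r : ServiceLoader.load(Runnable.class, loader)) {
                System.out.println("provider: " + r.getClass().getName());
            }
        }
    }
}
```

If a fat-jar build lets one dependency's service file clobber another's, entries such as 'Lucene50' simply disappear from the merged file, which is exactly the IllegalArgumentException that NamedSPILoader reports in the stderr above.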

So I added the following to E:\pom.xml:

<transformers>
    <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
        <resource>META-INF/services/org.apache.lucene.codecs.Codec</resource>
    </transformer>
    <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
        <resource>META-INF/services/org.apache.lucene.codecs.DocValuesFormat</resource>
    </transformer>
    <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
        <resource>META-INF/services/org.apache.lucene.codecs.PostingsFormat</resource>
    </transformer>
</transformers>

After rebuilding with this in place, the Elasticsearch error no longer appears!
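One way to sanity-check the rebuilt fat jar is to dump every copy of the PostingsFormat service file visible on the classpath; after the shade fix, the merged file should list the Lucene50 format. The helper below is hypothetical (ServiceFileCheck is my own class name, not from any library) and uses only the JDK:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Enumeration;

public class ServiceFileCheck {
    public static void main(String[] args) throws IOException {
        String res = "META-INF/services/org.apache.lucene.codecs.PostingsFormat";
        System.out.println("service files found for " + res + ":");
        // getResources returns every copy of the file on the classpath;
        // before the fix, the copy carrying the Lucene50 entry was the
        // one lost when the assembly build overwrote it.
        Enumeration<URL> urls =
                ServiceFileCheck.class.getClassLoader().getResources(res);
        while (urls.hasMoreElements()) {
            URL url = urls.nextElement();
            System.out.println("== " + url);
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(url.openStream(), StandardCharsets.UTF_8))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println("   " + line);
                }
            }
        }
    }
}
```

Run it with the same classpath the Spark executors see (for example, `java -cp your-shaded-jar.jar ServiceFileCheck` against the jar you ship with spark-submit); if the Lucene50 entry is missing from the output, the service files were still overwritten.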
