Spark job packaged as a jar runs fine in the shell, but fails when run with spark-submit

This post records a version-compatibility problem I hit while deploying a Spark application, including a string-operation exception and several NoSuchMethodError failures, and shares how I tracked them down.

Some of the errors really were baffling. Here is the spark-submit output:

2020-10-28 15:13:21,199 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(root); groups with view permissions: Set(); users  with modify permissions: Set(root); groups with modify permissions: Set()
2020-10-28 15:13:21,676 INFO util.Utils: Successfully started service 'sparkDriver' on port 34943.
2020-10-28 15:13:21,717 INFO spark.SparkEnv: Registering MapOutputTracker
2020-10-28 15:13:21,764 INFO spark.SparkEnv: Registering BlockManagerMaster
2020-10-28 15:13:21,794 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2020-10-28 15:13:21,794 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
2020-10-28 15:13:21,804 INFO spark.SparkEnv: Registering BlockManagerMasterHeartbeat
2020-10-28 15:13:21,821 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-4d3e2e6c-41ba-4687-b960-d5cfa509d670
2020-10-28 15:13:21,850 INFO memory.MemoryStore: MemoryStore started with capacity 413.9 MiB
2020-10-28 15:13:21,880 INFO spark.SparkEnv: Registering OutputCommitCoordinator
2020-10-28 15:13:22,013 INFO util.log: Logging initialized @3807ms to org.sparkproject.jetty.util.log.Slf4jLog
2020-10-28 15:13:22,122 INFO server.Server: jetty-9.4.z-SNAPSHOT; built: 2019-04-29T20:42:08.989Z; git: e1bc35120a6617ee3df052294e433f3a25ce7097; jvm 1.8.0_251-b08
2020-10-28 15:13:22,151 INFO server.Server: Started @3945ms
2020-10-28 15:13:22,196 INFO server.AbstractConnector: Started ServerConnector@2ca47471{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2020-10-28 15:13:22,196 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
2020-10-28 15:13:22,246 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2449cff7{/jobs,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,248 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@73393584{/jobs/json,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,249 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1827a871{/jobs/job,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,252 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@66238be2{/jobs/job/json,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,256 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@200606de{/stages,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,257 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@f8908f6{/stages/json,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,258 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2ef8a8c3{/stages/stage,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,260 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7544a1e4{/stages/stage/json,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,261 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7957dc72{/stages/pool,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,262 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3aacf32a{/stages/pool/json,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,263 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@82c57b3{/storage,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,264 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@600b0b7{/storage/json,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,271 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5ea502e0{/storage/rdd,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,272 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@473b3b7a{/storage/rdd/json,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,273 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@77b7ffa4{/environment,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,275 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@402f80f5{/environment/json,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,276 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@133e019b{/executors,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,277 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7dac3fd8{/executors/json,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,278 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@2102a4d5{/executors/threadDump,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,279 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3d4d3fe7{/executors/threadDump/json,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,299 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@51684e4a{/static,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,301 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6cc0bcf6{/,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,306 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@32f61a31{/api,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,307 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3dddbe65{/jobs/job/kill,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,308 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@344561e0{/stages/stage/kill,null,AVAILABLE,@Spark}
2020-10-28 15:13:22,310 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://node1:4040
2020-10-28 15:13:22,367 INFO spark.SparkContext: Added JAR file:/root/SparkTest.jar at spark://node1:34943/jars/SparkTest.jar with timestamp 1603869202367
2020-10-28 15:13:22,618 INFO executor.Executor: Starting executor ID driver on host node1
2020-10-28 15:13:22,660 INFO util.Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 35979.
2020-10-28 15:13:22,660 INFO netty.NettyBlockTransferService: Server created on node1:35979
2020-10-28 15:13:22,662 INFO storage.BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2020-10-28 15:13:22,669 INFO storage.BlockManagerMaster: Registering BlockManager BlockManagerId(driver, node1, 35979, None)
2020-10-28 15:13:22,672 INFO storage.BlockManagerMasterEndpoint: Registering block manager node1:35979 with 413.9 MiB RAM, BlockManagerId(driver, node1, 35979, None)
2020-10-28 15:13:22,677 INFO storage.BlockManagerMaster: Registered BlockManager BlockManagerId(driver, node1, 35979, None)
2020-10-28 15:13:22,678 INFO storage.BlockManager: Initialized BlockManager: BlockManagerId(driver, node1, 35979, None)
2020-10-28 15:13:22,903 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@22fba58c{/metrics/json,null,AVAILABLE,@Spark}
2020-10-28 15:13:23,787 INFO memory.MemoryStore: Block broadcast_0 stored as values in memory (estimated size 303.2 KiB, free 413.6 MiB)
2020-10-28 15:13:23,901 INFO memory.MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 27.7 KiB, free 413.6 MiB)
2020-10-28 15:13:23,904 INFO storage.BlockManagerInfo: Added broadcast_0_piece0 in memory on node1:35979 (size: 27.7 KiB, free: 413.9 MiB)
2020-10-28 15:13:23,919 INFO spark.SparkContext: Created broadcast 0 from textFile at App.scala:13
Exception in thread "main" java.lang.NoClassDefFoundError: scala/collection/StringOps$
	at cn.edu.swpu.scs.App$.main(App.scala:16)
	at cn.edu.swpu.scs.App.main(App.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:928)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1007)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1016)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: scala.collection.StringOps$
	at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
	... 14 more
2020-10-28 15:13:24,089 INFO spark.SparkContext: Invoking stop() from shutdown hook
2020-10-28 15:13:24,104 INFO server.AbstractConnector: Stopped Spark@2ca47471{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2020-10-28 15:13:24,106 INFO ui.SparkUI: Stopped Spark web UI at http://node1:4040
2020-10-28 15:13:24,124 INFO spark.MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
2020-10-28 15:13:24,138 INFO memory.MemoryStore: MemoryStore cleared
2020-10-28 15:13:24,139 INFO storage.BlockManager: BlockManager stopped
2020-10-28 15:13:24,153 INFO storage.BlockManagerMaster: BlockManagerMaster stopped
2020-10-28 15:13:24,157 INFO scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
2020-10-28 15:13:24,166 INFO spark.SparkContext: Successfully stopped SparkContext
2020-10-28 15:13:24,166 INFO util.ShutdownHookManager: Shutdown hook called
2020-10-28 15:13:24,167 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-7fb5b978-7a59-471d-83ac-85ed271cd9c5
2020-10-28 15:13:24,170 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-c372076b-af9a-40c6-b48a-184f66a3c48d

The job dies with java.lang.NoClassDefFoundError: scala/collection/StringOps$,
Caused by: java.lang.ClassNotFoundException: scala.collection.StringOps$
It felt completely undeserved: the same code runs in spark-shell and produces the correct result, yet packaged as a jar it failed again and again.

I checked the code over and over and was stuck for half a day, only to discover in the end that it was a version problem!
When writing Spark programs, always pay attention to version compatibility.

Let me record the errors I ran into.

1. String conversion not supported

That is, toDouble fails at runtime.
toDouble works in spark-shell, but not when the jar is submitted (this is the StringOps error in the log above).
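The toDouble failure and the StringOps$ stack trace above are what you get when a jar compiled with Scala 2.13 runs on a Spark distribution built with Scala 2.12: in 2.13, toDouble is an extension method on scala.collection.StringOps (a value class, whose methods compile to calls on the module class StringOps$), while in 2.12 the class lives at scala.collection.immutable.StringOps. A minimal sketch of code that trips it (the values are just illustrative):

```scala
object ToDoubleDemo {
  def main(args: Array[String]): Unit = {
    // Compiled with Scala 2.13, this line becomes a static call into
    // scala.collection.StringOps$ -- a class that does not exist in the
    // Scala 2.12 library shipped with Spark 3.0, hence the
    // NoClassDefFoundError at runtime rather than at compile time.
    val x = "3.14".toDouble
    println(x * 2)
  }
}
```

It compiles cleanly either way, which is exactly why the problem only shows up at spark-submit time.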

2.

Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.conforms()Lscala/Predef$$less$colon$less;

3、

NoSuchMethodError: scala.util.matching.Regex.<init>(Ljava/lang/String;Lscala/collection/Seq;)V
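This Regex error is, as far as I can tell, the same version mismatch in another guise: varargs parameters such as Regex's groupNames erase to scala.collection.Seq in Scala 2.12 but to scala.collection.immutable.Seq in 2.13, so code compiled against one Scala minor version calls a Regex constructor signature that does not exist in the other version's library. A minimal sketch of the kind of code that can trigger it (the pattern and input here are just illustrative):

```scala
object RegexDemo {
  def main(args: Array[String]): Unit = {
    // `.r` with group names (and `new Regex(...)` generally) passes a
    // varargs Seq to the Regex constructor. That varargs type changed
    // between Scala 2.12 and 2.13, so the call site baked into a jar
    // compiled with one version may not match the constructor present
    // in the scala-library on the cluster -- hence NoSuchMethodError.
    val date = """(\d{4})-(\d{2})-(\d{2})""".r("year", "month", "day")
    println(date.findFirstIn("released 2020-10-28"))
  }
}
```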

In short: if you have been checking for a long time and are sure the code itself is not the problem, look at your versions!
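For reference, this class of error goes away once the build compiles against the same Scala minor version that the Spark distribution was built with (Spark 3.0.x ships with Scala 2.12). A hypothetical sbt sketch, with versions as placeholders to be matched to your own cluster:

```scala
// build.sbt -- illustrative sketch; adjust the versions to your cluster.
// Spark 3.0.x is built against Scala 2.12, so the application must be
// compiled with a 2.12.x compiler, not 2.13.x.
ThisBuild / scalaVersion := "2.12.10"

libraryDependencies ++= Seq(
  // "provided": the cluster supplies Spark (and its scala-library) at
  // runtime, so don't bundle a second, possibly mismatched copy in the jar.
  "org.apache.spark" %% "spark-core" % "3.0.1" % "provided"
)
```

The %% operator appends the Scala binary version suffix (here _2.12) to the artifact name, which is what keeps the Spark dependency and the compiler in step.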
