When using flatMap with the Java API of Spark, the job fails with a NullPointerException. The error is as follows:
20/08/28 09:41:44 INFO DAGScheduler: ResultStage 0 (count at TestJob.java:252) failed in 3.500 s due to Job aborted due to stage failure: Task 299 in stage 0.0 failed 1 times, most recent failure: Lost task 299.0 in stage 0.0 (TID 299, localhost, executor driver): java.lang.NullPointerException
at scala.collection.convert.Wrappers$JIteratorWrapper.hasNext(Wrappers.scala:42)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:439)
at org.apache.spark.util.Utils$.getIteratorSize(Utils.scala:1817)
at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1168)
at org.apache.spark.rdd.RDD$$anonfun$count$1.apply(RDD.scala:1168)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2113)
at org.apache.spark.SparkContext$$anonfun$runJob$5.apply(SparkContext.scala:2113)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:121)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$11.apply(Executor.scala:407)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:413)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
20/08/28 09:41:44 INFO DAGScheduler: Job 0 failed: count at TestJob.java:252, took 3.555955 s
Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure ...
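
Update: judging from the frame scala.collection.convert.Wrappers$JIteratorWrapper.hasNext in the stack trace, this NPE usually means the FlatMapFunction returned null instead of an Iterator for some record. Below is a minimal, hypothetical sketch of that pattern and the usual fix (class and variable names are illustrative, not the actual TestJob.java code); the fix is to always return an iterator, using Collections.emptyIterator() when there is no output:

import java.util.Arrays;
import java.util.Collections;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class FlatMapNpeSketch {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("flatMap-npe-sketch").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        JavaRDD<String> lines = sc.parallelize(Arrays.asList("a,b", "", "c"));

        // Pattern that reproduces the stack trace above: returning null from the
        // flatMap function makes Spark wrap a null java.util.Iterator, and
        // JIteratorWrapper.hasNext throws a NullPointerException when count() runs.
        JavaRDD<String> buggy = lines.flatMap(line -> {
            if (line.isEmpty()) {
                return null; // <-- this is what typically triggers the NPE
            }
            return Arrays.asList(line.split(",")).iterator();
        });

        // Fix: never return null from flatMap; return an empty iterator instead.
        JavaRDD<String> fixed = lines.flatMap(line -> {
            if (line.isEmpty()) {
                return Collections.<String>emptyIterator();
            }
            return Arrays.asList(line.split(",")).iterator();
        });

        System.out.println(fixed.count()); // completes without the NullPointerException
        sc.stop();
    }
}

The same symptom can also appear if the collection whose iterator() is returned was built from something that is null for certain inputs, so it is worth checking every return path of the flatMap function at TestJob.java:252.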