Ubuntu + IDEA + Spark + Scala: project build fails with java.net.BindException: Cannot assign requested address

This post walks through the "Error initializing SparkContext" error raised at Spark startup and gives concrete steps to resolve it, including checking the IP address, the hostname, and the hosts file configuration.

Full error message:

ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing

Solution:

1. Run `ifconfig -a` in a terminal to find the IP address. VM users take note: this step is essential. Check the VM's own IP address, not the IP of the network your host machine is connected to.

2. Check the hostname by running the `hostname` command in a terminal.

3. Fix the IP mapping in /etc/hosts. If the IP address is 1.2.3.4 and the hostname is mahuateng, the file must contain the line

1.2.3.4    mahuateng

4. Rebuild the project.
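Why this fixes the error: the `sparkDriver` service tries to bind to the address that the machine's hostname resolves to, and when /etc/hosts maps the hostname to an IP the machine no longer owns, every bind attempt fails with "Cannot assign requested address". A minimal JDK-only sketch (no Spark required, class name is my own) to see what the hostname currently resolves to:

```java
import java.net.InetAddress;

public class CheckHostResolution {
    public static void main(String[] args) throws Exception {
        // Resolve the local hostname the same way the driver does at bind time.
        InetAddress local = InetAddress.getLocalHost();
        System.out.println("hostname:    " + local.getHostName());
        System.out.println("resolves to: " + local.getHostAddress());
        // If the printed address does not match any address shown by
        // `ifconfig -a`, sparkDriver cannot bind to it and fails with
        // "Cannot assign requested address".
    }
}
```

If the printed address differs from the VM's actual IP, correct the mapping line in /etc/hosts as described in step 3.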

java.lang.ArrayIndexOutOfBoundsException: 1
    at ... (JTeachers.java:24)
    at ... (JavaRDDLike.scala:143)
    at ... (Iterator.scala:434)
    at ... (ExternalSorter.scala:191)
    at ... (SortShuffleWriter.scala:63)
    at ... (ShuffleMapTask.scala:96)
    at ... (Task.scala:108)
    at ... (Executor.scala:335)
    at ... (ThreadPoolExecutor.java:1142)
    at ... (Thread.java:745)
(TID 0, localhost, executor driver)
Driver stacktrace:
    at ... (DAGScheduler.scala:1499)
    at ... (DAGScheduler.scala:814)
    at ... (EventLoop.scala:48)
    at ... (SparkContext.scala:2022)
    at ... (RDD.scala:936)
    at ... (JavaRDDLike.scala:361)
    at ... (JTeachers.java:31)
    at ... (SparkSubmit.scala:127)
Caused by: java.lang.ArrayIndexOutOfBoundsException: 1
    at ... (JTeachers.java:24)
    at ... (Executor.scala:335)
    at ... (Thread.java:745)

(Method names were lost when the trace was pasted; only file:line references survive, abridged here to the key frames.)
A `java.lang.ArrayIndexOutOfBoundsException: 1` means the program tried to access index 1 of an array whose length is less than 2, i.e. an out-of-bounds access; this exception is thrown whenever the index is negative or greater than or equal to the array's actual length[^2]. In the code provided, the problem most likely lies right after `line.split(" ")`: if a line contains no separator, or splitting yields fewer than 2 fields, indexing `parts[1]` raises this exception.

Here is the revised code with a length check added:

```java
package step2;

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaPairRDD;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import scala.Tuple2;

import java.util.Arrays;
import java.util.Collections;
import java.util.List;

public class JTeachers {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setMaster("local").setAppName("JTeachers");
        JavaSparkContext sc = new JavaSparkContext(conf);
        String dataFile = "file:///root/step2_files";

        // Step 1: create an RDD from an external file
        JavaRDD<String> lines = sc.textFile(dataFile);

        // Step 2: split each line and keep only well-formed records
        JavaPairRDD<String, Integer> pairs = lines.flatMapToPair(line -> {
            String[] parts = line.split(" ");
            if (parts.length >= 2) {
                try {
                    return Arrays.asList(
                            new Tuple2<>(parts[0], Integer.parseInt(parts[1]))).iterator();
                } catch (NumberFormatException e) {
                    // Skip records whose second field is not an integer
                    return Collections.emptyIterator();
                }
            }
            // Skip lines with fewer than two fields
            return Collections.emptyIterator();
        });

        // Step 3: aggregate values that share the same key
        JavaPairRDD<String, Integer> result = pairs.reduceByKey((x, y) -> x + y);

        // Step 4: collect the results to the driver
        List<Tuple2<String, Integer>> output = result.collect();

        // Step 5: print
        for (Tuple2<String, Integer> tuple : output) {
            System.out.println("(" + tuple._1() + "," + tuple._2() + ")");
        }
        sc.stop();
    }
}
```

The code above checks that the `parts` array has at least 2 elements before indexing into it, and wraps the integer conversion in a try-catch so that a `NumberFormatException` from an unparsable field is handled rather than propagated. `Collections.emptyIterator()` is used to return an empty result for malformed lines, since it type-checks cleanly as the lambda's `Iterator<Tuple2<String, Integer>>` return type.
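To see concretely why a missing separator triggers the exception, here is a small standalone check of `String.split` behavior (the sample strings are made up for illustration):

```java
public class SplitCheck {
    public static void main(String[] args) {
        String good = "zhangsan 90";   // well-formed: name, space, score
        String bad  = "zhangsan90";    // malformed: no separator

        String[] g = good.split(" ");
        String[] b = bad.split(" ");

        System.out.println(g.length); // 2 -> g[1] is safe to access
        System.out.println(b.length); // 1 -> b[1] throws ArrayIndexOutOfBoundsException: 1
    }
}
```

A single malformed line in the input file is enough to fail the whole task, which is why the guard in the revised job checks `parts.length` before touching `parts[1]`.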