/usr/local/spark/sbin/start-all.sh

This article walks through starting a Spark standalone cluster with `/usr/local/spark/sbin/start-all.sh`: what the script does, which configuration it depends on, and how to troubleshoot the problems that most often show up during startup.
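Before running the script, the cluster needs a `conf/slaves` file (called `conf/workers` in Spark 3.x) listing the worker hostnames, plus a `conf/spark-env.sh` with the Master settings, and passwordless SSH from the master to every worker. A minimal sketch, assuming the paths used in this article; the hostnames and values are examples, adjust them to your own environment:

```bash
# /usr/local/spark/conf/slaves -- one worker hostname per line (example hosts)
#   data1
#   data2

# /usr/local/spark/conf/spark-env.sh -- minimal settings (values are examples)
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64
export SPARK_MASTER_HOST=master                  # preferred over the deprecated SPARK_MASTER_IP
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop

# Start the Master on this node and a Worker on every host listed in conf/slaves
/usr/local/spark/sbin/start-all.sh

# Verify: Master web UI at http://master:8080, Worker web UI at http://<worker>:8081
```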
Running the script, the Master came up but the Worker on data2 failed to start:

```
data2: failed to launch: nice -n 0 /usr/local/spark/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://master:7077
```
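When a Worker reports `failed to launch`, the underlying error usually only appears in that Worker's own log on the failing node. A quick check, assuming the log filename follows the naming pattern shown in the listing below (the exact name on data2 may differ):

```bash
# Tail the Worker log directly on the failing node (filename pattern assumed)
ssh data2 'tail -n 50 /usr/local/spark/logs/spark-*-org.apache.spark.deploy.worker.Worker-1-data2.out'
```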
On the master node itself, the Master's startup log can be inspected under `/usr/local/spark/logs`:

```
hduser@master:/usr/local/spark/logs$ ls
spark-hduser-org.apache.spark.deploy.master.Master-1-master.out
spark-hduser-org.apache.spark.deploy.master.Master-1-master.out.1
spark-hduser-org.apache.spark.deploy.master.Master-1-master.out.2
spark-hduser-org.apache.spark.deploy.master.Master-1-master.out.3
spark-hduser-org.apache.spark.deploy.master.Master-1-master.out.4
spark-hduser-org.apache.spark.deploy.master.Master-1-master.out.5
hduser@master:/usr/local/spark/logs$ cat spark-hduser-org.apache.spark.deploy.master.Master-1-master.out
Spark Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -cp /usr/local/spark/conf/:/usr/local/spark/jars/*:/usr/local/hadoop/etc/hadoop/ -Xmx1g org.apache.spark.deploy.master.Master --host master --port 7077 --webui-port 8080
========================================
18/06/26 09:09:31 WARN MasterArguments: SPARK_MASTER_IP is deprecated, please use SPARK_MASTER_HOST
18/06/26 09:09:44 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
hduser@master:/usr/local/spark/logs$
```
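Both warnings in the Master log are non-fatal. The first is easy to silence: `SPARK_MASTER_IP` was deprecated in favour of `SPARK_MASTER_HOST`, so update `conf/spark-env.sh` accordingly (a sketch; use whatever hostname or IP your current configuration sets):

```bash
# /usr/local/spark/conf/spark-env.sh
# old, deprecated:
#   export SPARK_MASTER_IP=master
# new:
export SPARK_MASTER_HOST=master

# Restart so the Master picks up the change
/usr/local/spark/sbin/stop-all.sh
/usr/local/spark/sbin/start-all.sh
```

The NativeCodeLoader warning only means Hadoop's native libraries are not on the library path; Spark falls back to the pure-Java implementations and runs normally.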
A related symptom shows up at job submission time, when no Worker has registered with the Master or none has enough free resources; the driver then logs repeatedly:

```
18/06/26 09:38:39 WARN TaskSchedulerImpl: Initial job has not accepted any resources; check your cluster UI to ensure that workers are registered and have sufficient resources
```
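This warning means the application has registered with the Master but its resource request cannot be satisfied. A hedged checklist, assuming the standalone URL `spark://master:7077` used above (hostnames and resource values are examples):

```bash
# 1. Is a Worker JVM actually running on each worker node?
ssh data2 jps                      # should list a "Worker" process

# 2. Check http://master:8080 -- it shows registered Workers and their free cores/memory.

# 3. Ask for no more than the Workers can provide when submitting (example values):
/usr/local/spark/bin/spark-submit \
  --master spark://master:7077 \
  --executor-memory 512m \
  --total-executor-cores 1 \
  /usr/local/spark/examples/src/main/python/pi.py 10
```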
A common variant of the same startup failure, seen on a cluster with workers slave1 and slave2: both Workers report `failed to launch`, which raises the question of where `spark-class` actually is.

```
[root@master ~]# /usr/local/spark/sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark/logs/spark-root-org.apache.spark.deploy.master.Master-1-master.out
slave1: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-slave1.out
slave2: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-slave2.out
slave1: failed to launch: nice -n 0 /usr/local/spark/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://master:7077
slave2: failed to launch: nice -n 0 /usr/local/spark/bin/spark-class org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://master:7077
slave1: Spark Command: /usr/local/jdk/bin/java -cp /usr/local/spark/conf/:/usr/local/spark/jars/*:/usr/local/hadoop/etc/hadoop/:/usr/local/hive/lib/ -Xmx1g org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://master:7077
slave1: ========================================
slave1: full log in /usr/local/spark/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-slave1.out
slave2: Spark Command: /usr/local/jdk/bin/java -cp /usr/local/spark/conf/:/usr/local/spark/jars/*:/usr/local/hadoop/etc/hadoop/:/usr/local/hive/lib/ -Xmx1g org.apache.spark.deploy.worker.Worker --webui-port 8081 spark://master:7077
slave2: ========================================
slave2: full log in /usr/local/spark/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-slave2.out
```
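`spark-class` is the launcher that `start-all.sh` invokes over SSH on every worker; it ships with Spark at `$SPARK_HOME/bin/spark-class` and must exist, be executable, and sit at the same path on every node. A hedged check, using the hostnames from the output above:

```bash
# Verify the launcher exists and is executable on each worker
for host in slave1 slave2; do
  ssh "$host" 'ls -l /usr/local/spark/bin/spark-class'
done

# If Spark sits at a different path on the workers, either install it under
# /usr/local/spark there as well, or set SPARK_HOME consistently in
# conf/spark-env.sh on every node.
```

If `spark-class` is present, the `full log in .../spark-root-...Worker-1-slaveN.out` file named in the output carries the real error, often a wrong `JAVA_HOME` or a port already in use.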