Spark Installation

This article walks through installing and configuring Spark 1.2.0 on a system that already has JDK, Scala, and Hadoop installed. It covers extracting the package, setting environment variables, adding SCALA_HOME and the slaves configuration, and demonstrates starting and stopping Spark.

0. Prerequisites

The system already has the following installed: JDK, Scala, Hadoop

Spark installation package: spark-1.2.0-bin-hadoop2.4.tgz

1. Extract the installation package

caiyong@caiyong:~/setup$ sudo tar -zxvf spark-1.2.0-bin-hadoop2.4.tgz -C /opt/

caiyong@caiyong:/opt$ sudo mv spark-1.2.0-bin-hadoop2.4/ spark
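
A quick sanity check on the extracted directory (assuming the /opt/spark path above): the binary distribution ships bin, conf, lib and sbin subdirectories, which the later steps rely on.

caiyong@caiyong:/opt$ ls spark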

2. Set environment variables

Add the following lines to /etc/profile:

#spark environment

export SPARK_HOME=/opt/spark
export PATH=$PATH:$SPARK_HOME/bin


caiyong@caiyong:/opt$ source /etc/profile
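
To confirm the variables took effect in the current shell (assuming /etc/profile was edited as above), a quick check:

caiyong@caiyong:/opt$ echo $SPARK_HOME
/opt/spark
caiyong@caiyong:/opt$ which spark-shell
/opt/spark/bin/spark-shell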

3. Add SCALA_HOME

caiyong@caiyong:/opt/spark/conf$ cp spark-env.sh.template spark-env.sh

caiyong@caiyong:/opt/spark/conf$ gedit spark-env.sh
Add the Scala installation path to spark-env.sh:
export SCALA_HOME=/opt/scala
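
spark-env.sh also accepts other settings for the standalone mode. A minimal sketch of commonly used options (the values below are illustrative assumptions; adjust them for your machine):

export JAVA_HOME=/opt/jdk           # assumed JDK path; point at your own installation
export SPARK_MASTER_IP=localhost    # address the master binds to
export SPARK_WORKER_MEMORY=1g       # total memory a worker may hand to executors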

4. Add slaves

caiyong@caiyong:/opt/spark/conf$ gedit slaves

# A Spark Worker will be started on each of the machines listed below.
localhost
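
With only localhost listed, a single Worker starts on the local machine. For a real cluster, the file would instead list one worker hostname per line (the names below are hypothetical), and the master must be able to SSH to each host without a password:

# conf/slaves
worker1
worker2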

5. Start Spark

caiyong@caiyong:/opt/spark$ sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /opt/spark/sbin/../logs/spark-caiyong-org.apache.spark.deploy.master.Master-1-caiyong.out
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /opt/spark/sbin/../logs/spark-caiyong-org.apache.spark.deploy.worker.Worker-1-caiyong.out
caiyong@caiyong:/opt/spark$ jps
12354 Worker
12429 Jps
12146 Master
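
To verify the cluster actually accepts jobs, one option is the bundled SparkPi example. The sketch below assumes the master URL is spark://localhost:7077; the actual URL is printed in the master log and on the web UI:

caiyong@caiyong:/opt/spark$ MASTER=spark://localhost:7077 bin/run-example SparkPi 10

On success the output ends with a line like "Pi is roughly 3.14...".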

6. View the web UI

Browse the master's web UI (default: http://localhost:8080).
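
The page shows the master URL and every registered worker; each worker also serves its own UI on port 8081 by default. A headless check from the terminal (assuming curl is installed):

caiyong@caiyong:/opt/spark$ curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8080
200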



7. Stop Spark

caiyong@caiyong:/opt/spark$ sbin/stop-all.sh
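
Running jps again confirms both daemons have exited; only Jps itself should remain (the pid will differ):

caiyong@caiyong:/opt/spark$ jps
12629 Jps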


