0. Prerequisites
The system already has JDK, Scala, and Hadoop installed.
Spark installation package: spark-1.2.0-bin-hadoop2.4.tgz
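Before continuing, you can quickly confirm the prerequisites are on the PATH (the exact version numbers will vary):
java -version
scala -version
hadoop version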
1. Extract the installation package
caiyong@caiyong:~/setup$ sudo tar -zxvf spark-1.2.0-bin-hadoop2.4.tgz -C /opt/
caiyong@caiyong:/opt$ sudo mv spark-1.2.0-bin-hadoop2.4/ spark
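Since the archive was extracted with sudo, /opt/spark is owned by root. The later steps start Spark as a regular user, which needs write access to /opt/spark/logs, so you may want to hand the directory over to your own user first, e.g.:
sudo chown -R caiyong:caiyong /opt/spark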
2. Set environment variables
Add the following lines to /etc/profile:
# Spark environment
export SPARK_HOME=/opt/spark
export PATH=$PATH:$SPARK_HOME/bin
caiyong@caiyong:/opt$ source /etc/profile
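To check that the variables took effect in the current shell:
echo $SPARK_HOME
which spark-shell
The second command should print /opt/spark/bin/spark-shell.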
3. Add SCALA_HOME
caiyong@caiyong:/opt/spark/conf$ cp spark-env.sh.template spark-env.sh
caiyong@caiyong:/opt/spark/conf$ gedit spark-env.sh
Add the Scala installation path to spark-env.sh:
export SCALA_HOME=/opt/scala
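spark-env.sh also accepts optional tuning variables for the standalone mode. The defaults are fine on a single machine, but as an illustration you could cap what the worker claims (the values below are only examples):
export SPARK_WORKER_MEMORY=1g
export SPARK_WORKER_CORES=2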
4. Add slaves
caiyong@caiyong:/opt/spark/conf$ gedit slaves
# A Spark Worker will be started on each of the machines listed below.
localhost
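On a single machine, localhost is enough. For a real cluster you would instead list one worker hostname per line (and the master needs passwordless SSH to each of them), e.g. with hypothetical hostnames:
slave1
slave2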
5. Start Spark
caiyong@caiyong:/opt/spark$ sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /opt/spark/sbin/../logs/spark-caiyong-org.apache.spark.deploy.master.Master-1-caiyong.out
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /opt/spark/sbin/../logs/spark-caiyong-org.apache.spark.deploy.worker.Worker-1-caiyong.out
caiyong@caiyong:/opt/spark$ jps
12354 Worker
12429 Jps
12146 Master
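With Master and Worker up, you can verify the cluster actually executes jobs by submitting the bundled SparkPi example to the standalone master (assuming the default master port 7077):
caiyong@caiyong:/opt/spark$ MASTER=spark://localhost:7077 bin/run-example SparkPi 10
If everything is wired up correctly, the output ends with a rough estimate of Pi.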
6. View the web UI
Browse the master's web UI (default: http://localhost:8080).
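On a headless machine you can poke the UI from the command line instead, for example:
curl -s http://localhost:8080 | head
Each worker also serves its own UI, on port 8081 by default.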
7. Stop Spark
caiyong@caiyong:/opt/spark$ sbin/stop-all.sh
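Afterwards jps should list neither Master nor Worker; running it again is a quick sanity check:
caiyong@caiyong:/opt/spark$ jps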