Scala Installation
Extract the Scala archive to /usr/local/scala and configure the environment variables.
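A minimal sketch of that step, assuming the archive is scala-2.10.4.tgz (Spark 1.5.1 is built against Scala 2.10; substitute the version you actually downloaded) and that the environment variables live in /etc/profile:
# Unpack and move into place (archive name is an assumption)
tar -zxf scala-2.10.4.tgz -C /usr/local
mv /usr/local/scala-2.10.4 /usr/local/scala
# Add these two lines to /etc/profile, then reload it:
# export SCALA_HOME=/usr/local/scala
# export PATH=$PATH:$SCALA_HOME/bin
source /etc/profile
scala -version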
Spark Installation
Extract Spark to /usr/local/spark in the same way, then create spark-env.sh from the bundled template:
cd /usr/local/spark/conf
cp spark-env.sh.template spark-env.sh
vi spark-env.sh
# Locations of the JDK, Scala, and Hadoop installations;
# HADOOP_CONF_DIR tells Spark where to find the YARN and HDFS configuration
export JAVA_HOME=/usr/local/java
export SCALA_HOME=/usr/local/scala
export HADOOP_HOME=/usr/local/hadoop
export HADOOP_CONF_DIR=/usr/local/hadoop/etc/hadoop
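Before distributing the file, it is worth checking that every path it references actually exists. A quick check, using the paths configured above (yarn-site.xml is assumed to hold your YARN settings):
# Each of these should list the file rather than report "No such file or directory"
ls /usr/local/java/bin/java
ls /usr/local/scala/bin/scala
ls /usr/local/hadoop/etc/hadoop/yarn-site.xml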
Sync the Configuration
scp -r /usr/local/spark/conf root@sparkproject2:/usr/local/spark
scp -r /usr/local/spark/conf root@sparkproject3:/usr/local/spark
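To confirm the copy landed on each worker, compare checksums (this assumes SSH access to the workers, which the scp commands above already require):
# The three checksums should be identical
md5sum /usr/local/spark/conf/spark-env.sh
ssh root@sparkproject2 md5sum /usr/local/spark/conf/spark-env.sh
ssh root@sparkproject3 md5sum /usr/local/spark/conf/spark-env.sh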
Test the Installation
/usr/local/spark/bin/spark-submit \
--class org.apache.spark.examples.JavaSparkPi \
--master yarn-client \
--num-executors 1 \
--driver-memory 512m \
--executor-memory 512m \
--executor-cores 1 \
/usr/local/spark/lib/spark-examples-1.5.1-hadoop2.4.0.jar
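If the submission succeeds, the driver prints the example's pi approximation to the console (in yarn-client mode the driver runs locally). As a further smoke test, an interactive session against the same cluster can be opened with spark-shell; the REPL line shown in the comment is a trivial job that exercises the executors:
# Optional interactive check against YARN
/usr/local/spark/bin/spark-shell --master yarn-client --num-executors 1 --executor-memory 512m
# At the scala> prompt:
#   sc.parallelize(1 to 1000).sum()   // expected result: 500500.0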
This article walked through installing Scala and Spark: configuring environment variables, setting up the Spark environment, and syncing the configuration across machines. The commands above demonstrate how to extract, configure, and test the installation on Linux and confirm that the cluster runs correctly.