(Repost) Reference: https://www.cnblogs.com/NextNight/p/6703362.html
Existing environment:
A Hadoop 2.9.0 cluster is already installed (see earlier blog posts for the installation process).
Perform the following on the master node:
1 Install Scala
1.1 Download the installation package
Note: Starting version 2.0, Spark is built with Scala 2.11 by default.
Scala 2.10 users should download the Spark source package and build
with Scala 2.10 support.
Because Spark requires a specific Scala version, Scala 2.11 is the right choice for installing Spark 2.4.0.
Download page: https://www.scala-lang.org/download/2.11.12.html
Choose the tgz archive at the bottom of the page.
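If the server has direct internet access, the tarball can alternatively be fetched on the server itself. A minimal sketch, assuming the usual Lightbend download URL for this release (verify it against the download page above):
# download the Scala 2.11.12 tarball into the shared software directory
cd /opt/nfs_share/software
wget https://downloads.lightbend.com/scala/2.11.12/scala-2.11.12.tgz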
1.2 Upload and install
Upload the archive to: /opt/nfs_share/software
Create the scala directory: mkdir -p /opt/scala
Extract:
cd /opt/scala
tar -zxvf /opt/nfs_share/software/scala-2.11.12.tgz
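A quick sanity check that the archive unpacked into the directory the environment variables below expect:
# should list a scala-2.11.12 directory
ls /opt/scala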
1.3 Configure environment variables
vim ~/.bash_profile
# append at the end
#scala environment
export SCALA_HOME=/opt/scala/scala-2.11.12
export PATH=$PATH:$SCALA_HOME/bin
Apply immediately: source ~/.bash_profile
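To confirm the variables took effect in the current shell (a simple check, not part of the original steps):
echo $SCALA_HOME   # should print /opt/scala/scala-2.11.12
which scala        # should resolve to a path under $SCALA_HOME/bin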
1.4 Verify
scala -version
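If everything is in place, the command should print something like the following (the copyright years may differ):
Scala code runner version 2.11.12 -- Copyright 2002-2017, LAMP/EPFL
A one-line smoke test of the runner can also be tried (assuming the scala launcher's -e option):
scala -e 'println("hello from scala")'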