Since Spark relies on Hadoop's HDFS file system, Hadoop must be installed first.
1. Install the JDK.
2. Install Hadoop (reference: https://www.cnblogs.com/gyouxu/p/4183417.html).
2.1 Install SSH (for version-mismatch issues, see https://blog.youkuaiyun.com/woshiliulei0/article/details/51861805).
3. Install Spark (reference: https://blog.youkuaiyun.com/weixin_36394852/article/details/76030317).
4. Install Scala (reference: https://blog.youkuaiyun.com/weixin_36394852/article/details/75948991).
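After unpacking each package from the steps above, the shell needs to know where they live. The sketch below shows the environment variables typically appended to `~/.bashrc`; every install path here is an assumption (adjust to wherever you actually extracted the JDK, Hadoop, Spark, and Scala).

```shell
# Environment-variable sketch for the install steps above.
# All paths are assumptions -- replace them with your actual install locations.
export JAVA_HOME=/usr/local/jdk1.8.0       # step 1: JDK
export HADOOP_HOME=/usr/local/hadoop       # step 2: Hadoop
export SPARK_HOME=/usr/local/spark         # step 3: Spark
export SCALA_HOME=/usr/local/scala         # step 4: Scala
export PATH=$PATH:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$SPARK_HOME/bin:$SCALA_HOME/bin
```

After appending these lines, run `source ~/.bashrc`, then check each step with `java -version`, `hadoop version`, `spark-shell --version`, and `scala -version`.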