Docker-based Hadoop-Spark cluster installation (CentOS 7)
Reference blog: https://zhuanlan.zhihu.com/p/421375012
Recommended configuration (for experimenting on a personal machine): 4-8 GB RAM + 8 CPU cores
Operating system: CentOS 7, installed in a VMware virtual machine
Steps:
- Pull a prebuilt Hadoop + Spark Docker image: docker pull s1mplecc/spark-hadoop:3
- Write a docker-compose.yml that shares the container directory /opt/share with ~/docker/spark/share on the host, which makes it easy to edit and transfer files (a minimal usage sketch follows the file):
```yaml
version: '2'
services:
  spark:
    image: s1mplecc/spark-hadoop:3
    hostname: master
    environment:
      - SPARK_MODE=master
      - SPARK_RPC_AUTHENTICATION_ENABLED=no
      - SPARK_RPC_ENCRYPTION_ENABLED=no
      - SPARK_LOCAL_STORAGE_ENCRYPTION_ENABLED=no
      - SPARK_SSL_ENABLED=no
    volumes:
      - ~/docker/spark/share:/opt/share
    ports:
      - '8080:8080'
      - '4040:4040'
      - '8088:8088'
      - '8042:8042'
      - '9870:9870'
      - '19888:19888'
```
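A minimal sketch of bringing the container up and getting a shell inside it, assuming the compose file above is saved as ~/docker/spark/docker-compose.yml (the `docker compose` plugin is assumed; older installs use the standalone `docker-compose` binary instead):

```sh
# Sketch only: assumes Docker and the compose plugin are installed,
# and the compose file above lives at ~/docker/spark/docker-compose.yml.
mkdir -p ~/docker/spark/share   # host side of the shared /opt/share volume
cd ~/docker/spark
docker compose up -d            # start the master container in the background
docker compose ps               # the 'spark' service should show as Up
docker compose exec spark bash  # open a shell on the master node
```

Once the Hadoop daemons are running inside the container, the ports mapped above expose the usual web UIs on the host: the Spark master UI on http://localhost:8080, the YARN ResourceManager on http://localhost:8088, the HDFS NameNode on http://localhost:9870, and the MapReduce JobHistory server on http://localhost:19888.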