Deploying a Hadoop Environment on macOS

This article walks through setting up Hadoop 2.8.5 on macOS Mojave 10.14: configuring openssl, JDK 1.8, and Maven 3.5.3, downloading and installing Hadoop, setting environment variables, editing the Hadoop component configuration files, and finally starting and stopping the Hadoop services.

0 Environment

  • macOS Mojave 10.14
  • openssl
  • JDK 1.8
  • Maven 3.5.3
  • Hadoop 2.8.5

1 openssl Configuration

export OPENSSL_ROOT_DIR=/usr/local/Cellar/openssl/1.0.2p
export OPENSSL_INCLUDE_DIR=/usr/local/Cellar/openssl/1.0.2p/include
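
The exact Cellar directory depends on the openssl version Homebrew installed, so the 1.0.2p path above may not match your machine. Assuming openssl was installed via Homebrew, you can look up the real path first and substitute it into the exports above:

brew --prefix openssl          # prints /usr/local/opt/openssl, a symlink into the versioned Cellar directory
ls /usr/local/Cellar/openssl/  # lists the installed version directories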

2 JDK Configuration

2.1 Download the JDK

Archive of JDK versions: https://www.oracle.com/technetwork/java/javase/archive-139210.html

2.2 Configuration

vim ~/.bash_profile
JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_171.jdk/Contents/Home
export JAVA_HOME
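
To make the change take effect and confirm the JDK is visible (a quick sanity check; the version string assumes the jdk1.8.0_171 install shown above):

source ~/.bash_profile
java -version
# Expect: java version "1.8.0_171"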

3 Maven Configuration

3.1 Download Maven

Download page: http://maven.apache.org/download.cgi
Choosing a version: on the page linked above, go to Previous Releases >> archives >> the version list >> Parent Directory, where maven-1, maven-2, and maven-3 are available.

3.2 Configure Maven

vim ~/.bash_profile
M2_HOME=/Users/xindaqi/Library/apache-maven-3.5.3
export M2_HOME
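
Note that M2_HOME by itself does not put mvn on the PATH. If you also want to run mvn from the shell, a common addition (my assumption; not part of the original setup) is:

# Hypothetical addition: expose the Maven binaries on the PATH
export PATH=$PATH:$M2_HOME/bin
mvn -v  # verify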

4 Hadoop Configuration

4.1 Download Hadoop

Download Hadoop 2.8.5: http://hadoop.apache.org/releases.html

4.2 Hadoop Configuration Files

4.2.1 Extract

tar -zxvf hadoop-2.8.5.tar.gz -C yourpath

4.2.2 core-site.xml

<configuration>
	<!-- Directory for files Hadoop generates at runtime -->
	<property>
		<name>hadoop.tmp.dir</name>
		<value>/Users/xindaqi/hadoop-2.8.5/xdq</value>
	</property>
</configuration>
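
Note that this core-site.xml does not set fs.defaultFS; instead, the NameNode address is pinned explicitly via dfs.namenode.rpc-address in hdfs-site.xml below (see the fix in section 6). The more common equivalent, shown here only as an alternative sketch and not used in this guide, would be:

<property>
	<!-- Alternative (not used in this guide): default filesystem URI -->
	<name>fs.defaultFS</name>
	<value>hdfs://localhost:8080</value>
</property>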

4.2.3 hdfs-site.xml

<configuration>
	<property>
		<!-- Number of replicas HDFS keeps for each block (including the original); default is 3 -->
		<!-- Must be 1 in pseudo-distributed mode -->
		<name>dfs.replication</name>
		<value>1</value>
	</property>

	<!-- Allow non-root users to write files to HDFS -->
	<property>
		<name>dfs.permissions</name>
		<!-- Disable HDFS permission checking -->
		<value>false</value>
	</property>

	<property>
		<!-- Where the NameNode stores the fsimage -->
		<name>dfs.namenode.name.dir</name>
		<value>file:/Users/xindaqi/hadoop-2.8.5/xdq/hdfs/name</value>
	</property>

	<property>
		<!-- Where the DataNode stores block files -->
		<name>dfs.datanode.data.dir</name>
		<value>file:/Users/xindaqi/hadoop-2.8.5/xdq/hdfs/data</value>
	</property>
	<property>
		<!-- NameNode RPC address -->
		<name>dfs.namenode.rpc-address</name>
		<value>localhost:8080</value>
	</property>

	<property>
		<name>dfs.namenode.secondary.http-address</name>
		<value>localhost:9001</value>
	</property>

	<property>
		<name>dfs.webhdfs.enabled</name>
		<value>true</value>
	</property>
</configuration>

4.2.4 mapred-site.xml
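
Hadoop ships this file as mapred-site.xml.template; the framework only reads mapred-site.xml, so copy the template first (run from the Hadoop install root):

cp etc/hadoop/mapred-site.xml.template etc/hadoop/mapred-site.xml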

<configuration>
	<property>
		<name>mapreduce.framework.name</name>
		<value>yarn</value>
	</property>

	<property>
		<name>mapreduce.admin.user.env</name>
		<value>HADOOP_MAPRED_HOME=$HADOOP_COMMON_HOME</value>
	</property>
	<property>
		<name>yarn.app.mapreduce.am.env</name>
		<value>HADOOP_MAPRED_HOME=$HADOOP_COMMON_HOME</value>
	</property>
	<property>
		<name>mapreduce.application.classpath</name>
		<value>
			/Users/xindaqi/hadoop-2.8.5/etc/hadoop,
			/Users/xindaqi/hadoop-2.8.5/share/hadoop/common/*,
			/Users/xindaqi/hadoop-2.8.5/share/hadoop/common/lib/*,
			/Users/xindaqi/hadoop-2.8.5/share/hadoop/hdfs/*,
			/Users/xindaqi/hadoop-2.8.5/share/hadoop/hdfs/lib/*,
			/Users/xindaqi/hadoop-2.8.5/share/hadoop/mapreduce/*,
			/Users/xindaqi/hadoop-2.8.5/share/hadoop/mapreduce/lib/*,
			/Users/xindaqi/hadoop-2.8.5/share/hadoop/yarn/*,
			/Users/xindaqi/hadoop-2.8.5/share/hadoop/yarn/lib/*
		</value>
	</property>
</configuration>

4.2.5 yarn-site.xml

<configuration>
<!-- Site specific YARN configuration properties -->
	<property>
		<name>yarn.resourcemanager.hostname</name>
		<value>localhost</value>
	</property>
	<property>
		<!-- How the NodeManager fetches data (shuffle service) -->
		<name>yarn.nodemanager.aux-services</name>
		<value>mapreduce_shuffle</value>
	</property>
	<property>
		<name>mapreduce.application.classpath</name>
		<value>
			/Users/xindaqi/hadoop-2.8.5/etc/hadoop,
			/Users/xindaqi/hadoop-2.8.5/share/hadoop/common/*,
			/Users/xindaqi/hadoop-2.8.5/share/hadoop/common/lib/*,
			/Users/xindaqi/hadoop-2.8.5/share/hadoop/hdfs/*,
			/Users/xindaqi/hadoop-2.8.5/share/hadoop/hdfs/lib/*,
			/Users/xindaqi/hadoop-2.8.5/share/hadoop/mapreduce/*,
			/Users/xindaqi/hadoop-2.8.5/share/hadoop/mapreduce/lib/*,
			/Users/xindaqi/hadoop-2.8.5/share/hadoop/yarn/*,
			/Users/xindaqi/hadoop-2.8.5/share/hadoop/yarn/lib/*
		</value>
	</property>
</configuration>

4.2.6 hadoop-env.sh Configuration

export JAVA_HOME=${JAVA_HOME}
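
On macOS the ${JAVA_HOME} indirection can resolve to an empty value when the daemons are launched over ssh in a non-login shell. If that happens, hardcoding the JDK path from section 2.2 is a safe fallback:

# Fallback: hardcode the path instead of relying on the inherited variable
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_171.jdk/Contents/Home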

4.2.7 log4j.properties

log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR

4.3 System Environment Configuration

As in the earlier sections, append these lines to ~/.bash_profile:
export HADOOP_HOME=/Users/xindaqi/hadoop-2.8.5
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib/native:$HADOOP_COMMON_LIB_NATIVE_DIR"
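
Reload the profile and check that the Hadoop binaries resolve:

source ~/.bash_profile
hadoop version
# Expect the first line to report Hadoop 2.8.5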

4.4 SSH Configuration

Enable remote login: System Preferences >> Sharing >> check Remote Login

ssh-keygen -t rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
# Generate the key pair; when prompted later, just enter your local machine's password
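
The Hadoop start scripts log in over ssh, so confirm passwordless login works before starting anything:

ssh localhost
# Should open a shell without asking for a password; type exit to leave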

5 Hadoop Operations

5.1 Starting and Stopping

# Format the namenode
hadoop namenode -format
# Start HDFS
./start-dfs.sh
# Stop HDFS
./stop-dfs.sh
# Start YARN
./start-yarn.sh
# Stop YARN
./stop-yarn.sh
# Start everything
./start-all.sh
# Stop everything
./stop-all.sh
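
To confirm the daemons are actually up, jps (bundled with the JDK) lists the running Java processes:

jps
# After start-all.sh expect: NameNode, DataNode, SecondaryNameNode,
# ResourceManager, NodeManager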

5.2 Web UIs

# HDFS
localhost:50070

Figure 5.1: HDFS overview page

# Hadoop (YARN)
localhost:8088

Figure 5.2: Hadoop cluster page

6 Summary

  • Fixes for problems encountered
    (1) rpc-address
Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured

Fix:

# hdfs-site.xml
<name>dfs.namenode.rpc-address</name>
<value>localhost:8080</value>

    (2) Warning that the native-hadoop library cannot be loaded

util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Fix:

# Add to log4j.properties
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
  • No need to recompile the Hadoop-2.8.5-src sources;
  • Leave a comment if you run into problems;

The installation process was genuinely bumpy. In my tests there is no need to recompile the Hadoop-2.8.5-src sources; fixing each problem as it appears is enough to get Hadoop running. When an error is reported, read the error message, adjust the corresponding configuration, and the issue can be resolved.

For Hadoop on Ubuntu, see: Deploying a Hadoop Environment on Ubuntu

