Step 1: Edit the hdfs-site.xml file:
<configuration>
<property>
<name>dfs.replication</name>
<value>1</value>
</property>
</configuration>
Step 2: Edit the core-site.xml file:
<configuration>
<property>
<name>fs.defaultFS</name>
<value>hdfs://172.17.152.57:9000</value>
</property>
<property>
<name>hadoop.tmp.dir</name>
<value>/data/hadoopData</value>
</property>
</configuration>
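The directory configured for hadoop.tmp.dir must exist and be writable by the user that starts Hadoop; a minimal preparation step, assuming the /data/hadoopData path above:
mkdir -p /data/hadoopData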
Step 3: Edit the yarn-site.xml file:
<configuration>
<property>
<name>yarn.nodemanager.aux-services</name>
<value>mapreduce_shuffle</value>
</property>
<!-- Site specific YARN configuration properties -->
</configuration>
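Depending on the Hadoop 3.x release, the single-node guide also whitelists the environment variables that the NodeManager passes to containers; a sketch of that optional extra property (verify the value against your version's documentation):
<property>
<name>yarn.nodemanager.env-whitelist</name>
<value>JAVA_HOME,HADOOP_COMMON_HOME,HADOOP_HDFS_HOME,HADOOP_CONF_DIR,CLASSPATH_PREPEND_DISTCACHE,HADOOP_YARN_HOME,HADOOP_MAPRED_HOME</value>
</property>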
Step 4: Add the JDK path to hadoop-env.sh:
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.201.b09-2.el7_6.x86_64
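If the exact JDK path on the machine is unknown, one way to find it (assuming OpenJDK was installed through the system package manager) is:
readlink -f $(which java)
# prints .../bin/java (or .../jre/bin/java); drop that suffix to get the value for JAVA_HOME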
Step 5: Edit the mapred-site.xml file:
<configuration>
<property>
<name>mapreduce.framework.name</name>
<value>yarn</value>
</property>
</configuration>
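On some Hadoop 3.x releases, MapReduce jobs submitted to YARN cannot find the framework classes unless the classpath is also declared here; a sketch of the commonly added property (value as in the stock single-node guide, adjust if your installation layout differs):
<property>
<name>mapreduce.application.classpath</name>
<value>$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/*:$HADOOP_MAPRED_HOME/share/hadoop/mapreduce/lib/*</value>
</property>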
Step 6: Edit the start-yarn.sh file (in the sbin directory of the installation) and add:
YARN_RESOURCEMANAGER_USER=root
HADOOP_SECURE_DN_USER=yarn
YARN_NODEMANAGER_USER=root
Step 7: Edit the start-dfs.sh file (in the sbin directory of the installation) and add:
HDFS_DATANODE_USER=root
HADOOP_SECURE_DN_USER=hdfs
HDFS_NAMENODE_USER=root
HDFS_SECONDARYNAMENODE_USER=root
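These variables are needed because Hadoop 3.x refuses to launch the HDFS and YARN daemons when the start scripts are run as root unless the user for each daemon is declared explicitly.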
Step 8: Generate an SSH key and authorize it for passwordless login:
ssh-keygen -t rsa
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
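After the configuration above, a typical first-run sequence (assuming $HADOOP_HOME points at the unpacked installation and the commands are run as root, the user configured in steps 6 and 7) is:
ssh localhost        # confirm passwordless login works, then exit
$HADOOP_HOME/bin/hdfs namenode -format
$HADOOP_HOME/sbin/start-dfs.sh
$HADOOP_HOME/sbin/start-yarn.sh
jps                  # should list NameNode, DataNode, SecondaryNameNode, ResourceManager, NodeManager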