Hadoop Installation: Problem Log

1. Hadoop installation errors



1. Running bin/hdfs namenode -format ends with SHUTDOWN_MSG: Shutting down NameNode at huwei.local/192.168.1.100. This looks alarming at first, but a SHUTDOWN_MSG at the end of a format run is normal; the INFO lines below show the storage directory was formatted successfully.

2021-01-24 00:09:11,446 INFO common.Storage: Storage directory /usr/local/Cellar/hadoop/3.3.0/name has been successfully formatted.
2021-01-24 00:09:11,467 INFO namenode.FSImageFormatProtobuf: Saving image file /usr/local/Cellar/hadoop/3.3.0/name/current/fsimage.ckpt_0000000000000000000 using no compression
2021-01-24 00:09:11,549 INFO namenode.FSImageFormatProtobuf: Image file /usr/local/Cellar/hadoop/3.3.0/name/current/fsimage.ckpt_0000000000000000000 of size 400 bytes saved in 0 seconds .
2021-01-24 00:09:11,555 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid >= 0
2021-01-24 00:09:11,558 INFO namenode.FSImage: FSImageSaver clean checkpoint: txid=0 when meet shutdown.
2021-01-24 00:09:11,558 INFO namenode.NameNode: SHUTDOWN_MSG: 
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at huwei.local/192.168.1.100
************************************************************/
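For context, the storage directory in the log above and the hdfs://localhost:9000 URI used later in this post come from two standard settings. A minimal sketch, assuming the Homebrew layout shown here (adjust paths to your own install):

core-site.xml:
<property>
  <!-- default filesystem URI; this is where hdfs://localhost:9000 comes from -->
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:9000</value>
</property>

hdfs-site.xml:
<property>
  <!-- where the NameNode keeps its metadata; this is the directory the format step writes -->
  <name>dfs.namenode.name.dir</name>
  <value>/usr/local/Cellar/hadoop/3.3.0/name</value>
</property>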

 

2. Starting YARN fails with "Cannot set priority of resourcemanager process 42801". None of the blog posts I found had a working fix; in the end the logs pointed to the problem.

MacBook-Pro 3.3.0 % libexec/sbin/start-yarn.sh 
Starting resourcemanager
ERROR: Cannot set priority of resourcemanager process 42801

The log file (vim /usr/local/Cellar/hadoop/3.3.0/libexec/logs/hadoop-***-resourcemanager-****.local.log) pinpointed the issue: the yarn.resourcemanager.address property in yarn-site.xml was misconfigured. Dropping the port fixed it: localhost:8088 -> localhost.
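By default yarn.resourcemanager.address uses port 8032, while 8088 is the YARN web UI port, so pointing the RM RPC address at 8088 most likely caused a port clash. A sketch of the corrected property (values mirror this post; treat it as an example rather than the exact file):

yarn-site.xml:
<property>
  <!-- RM RPC address; with no explicit port it falls back to the default (8032) -->
  <name>yarn.resourcemanager.address</name>
  <value>localhost</value>
</property>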

Tip: the files ending in .log are the ones worth reading; the .out files usually add nothing.
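For example, something along these lines pulls the interesting lines out of the ResourceManager log quickly (the path pattern matches the masked one above):

grep -iE 'error|fatal' /usr/local/Cellar/hadoop/3.3.0/libexec/logs/hadoop-*-resourcemanager-*.log | tail -n 20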

After restarting (the first attempt in the transcript below still shows the error; the second one is clean), everything comes up and jps confirms the daemons are running:

MacBook-Pro 3.3.0 % libexec/sbin/start-yarn.sh
Starting resourcemanager
ERROR: Cannot set priority of resourcemanager process 55002
Starting nodemanagers
MacBook-Pro 3.3.0 % libexec/sbin/start-yarn.sh
Starting resourcemanager
Starting nodemanagers
MacBook-Pro 3.3.0 % jps
42146 DataNode
82370 NodeManager
1475 Launcher
42307 SecondaryNameNode
42005 NameNode
82471 Jps
82232 ResourceManager
1034 

HDFS web UI: http://localhost:9870/

YARN web UI: http://localhost:8088/cluster
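A quick terminal check that both UIs respond, if you prefer not to open a browser (ports are the ones above; expect a 200 or a redirect code):

curl -s -o /dev/null -w "%{http_code}\n" http://localhost:9870/
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:8088/cluster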

2. Running a Hadoop example

Upload a file

MacBook-Pro 3.3.0 % hadoop fs -put /usr/local/Cellar/hadoop/3.3.0/tmp/test/input.txt hdfs://localhost:9000/

List the files

MacBook-Pro 3.3.0 % hadoop fs -ls  hdfs://localhost:9000/
2021-01-24 18:20:41,082 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 1 items
-rw-r--r--   1 huwei supergroup        623 2021-01-24 18:20 hdfs://localhost:9000/input.txt

HDFS is working, so next test MapReduce.

1. Run WordCount

huwei@huwei 3.3.0 % hadoop jar libexec/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.0.jar wordcount hdfs://localhost:9000/input.txt  hdfs://localhost:9000/output/wordcount1.out

This fails with a "could not find or load main class" error; the write-up at https://my.oschina.net/u/4257185/blog/3545368 covers the fix in detail (see also the sketch after the log below).

Failing this attempt.Diagnostics: [2021-01-24 18:28:17.421]Exception from container-launch.
Container id: container_1611481682614_0001_02_000001
Exit code: 1

[2021-01-24 18:28:17.423]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster


[2021-01-24 18:28:17.424]Container exited with a non-zero exit code 1. Error file: prelaunch.err.
Last 4096 bytes of prelaunch.err :
Last 4096 bytes of stderr :
Error: Could not find or load main class org.apache.hadoop.mapreduce.v2.app.MRAppMaster
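The error means the YARN containers cannot see the MapReduce classes. Without restating the linked post, a common remedy is to hand the classpath to YARN explicitly; a sketch (paste the full output of hadoop classpath, shown near the end of this post, as the value):

yarn-site.xml:
<property>
  <!-- make the Hadoop/MapReduce jars visible inside YARN containers -->
  <name>yarn.application.classpath</name>
  <value>paste the output of hadoop classpath here</value>
</property>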

After the job runs, check the output in HDFS:

hadoop fs -ls  hdfs://localhost:9000/output

drwxr-xr-x   - huwei supergroup          0 2021-01-24 19:15 hdfs://localhost:9000/output/wordcount1.out


Pull the result down to the local filesystem:

huwei@huweideMacBook-Pro 3.3.0 % hadoop fs -get  hdfs://localhost:9000/output/wordcount1.out   /usr/local/Cellar/hadoop/3.3.0/tmp/test/ 


View the result file locally:

huwei@huwei libexec % vim /usr/local/Cellar/hadoop/3.3.0/tmp/test/wordcount1.out/part-r-00000

Result: each line of part-r-00000 is a word followed by its count.
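The counts can also be read straight from HDFS, without copying anything to the local filesystem:

hadoop fs -cat hdfs://localhost:9000/output/wordcount1.out/part-r-00000 | head -n 20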

Unresolved issue: the same command, hadoop classpath, succeeds in one terminal window and fails in another.

-- window where it fails
3.3.0 % hadoop
WARNING: log4j.properties is not found. HADOOP_CONF_DIR may be incomplete.
ERROR: Invalid HADOOP_COMMON_HOME


-- window where it works
 libexec % hadoop classpath
/usr/local/Cellar/hadoop/3.3.0/libexec/etc/hadoop:/usr/local/Cellar/hadoop/3.3.0/libexec/share/hadoop/common/lib/*:/usr/local/Cellar/hadoop/3.3.0/libexec/share/hadoop/common/*:/usr/local/Cellar/hadoop/3.3.0/libexec/share/hadoop/hdfs:/usr/local/Cellar/hadoop/3.3.0/libexec/share/hadoop/hdfs/lib/*:/usr/local/Cellar/hadoop/3.3.0/libexec/share/hadoop/hdfs/*:/usr/local/Cellar/hadoop/3.3.0/libexec/share/hadoop/mapreduce/*:/usr/local/Cellar/hadoop/3.3.0/libexec/share/hadoop/yarn:/usr/local/Cellar/hadoop/3.3.0/libexec/share/hadoop/yarn/lib/*:/usr/local/Cellar/hadoop/3.3.0/libexec/share/hadoop/yarn/*
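This one is still unresolved, but the symptoms (works in one window, Invalid HADOOP_COMMON_HOME in another) suggest the Hadoop environment variables are only resolvable in some shells or working directories. One thing worth trying is pinning them in ~/.zshrc; the paths below simply mirror the Homebrew layout used throughout this post, so treat them as an assumption:

# ~/.zshrc
export HADOOP_HOME=/usr/local/Cellar/hadoop/3.3.0/libexec
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin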

3. Summary

1. You have to be comfortable with Linux commands. It had been a long time since I used Linux regularly, so I had to look up many commands online rather than knowing them by heart.

2. Learn to read the official documentation and the system logs. The installation involves many steps and plenty of things go wrong; finding the root cause in the logs first and only then searching blog posts is far more efficient.

 

4. MySQL

Fix for "zsh: command not found: service"

Use the services command provided by Homebrew instead,

e.g. $ brew services start mysql
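A couple of related subcommands that are handy here:

brew services list            # show services managed by Homebrew and their status
brew services stop mysql      # stop MySQL
brew services restart mysql   # restart it, e.g. after changing its config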

5. Hive

 

 

References:

1.https://blog.youkuaiyun.com/lidongmeng0213/article/details/105495250

2.https://blog.youkuaiyun.com/xianglingchuan/article/details/86674594

3.https://blog.youkuaiyun.com/xiaozhuangyumaotao/article/details/106010114
