-- process information unavailable (this may not apply to everyone)

This post documents the problems I ran into while deploying Hadoop in distributed mode, including connection failures caused by misconfigured passwordless SSH login, and how to find and clean up the leftover process behind the error. Working through a series of steps with jps, ps -ef, and grep to pinpoint the problem, the abnormal DataNode startup issue was eventually resolved.

Here is how I ended up with this message:
While setting up Hadoop in distributed mode (hadoop102, hadoop103, hadoop104), I had configured passwordless SSH with a different username on each of the three machines. As a result, when running start-dfs.sh, hadoop102 could not connect to the other machines and would not exit on its own, so I force-quit it with Ctrl+C.
I am not sure why, but the DataNodes on hadoop103 and hadoop104 had already started, and to save trouble I simply ran kill -9 12610 to kill the DataNode.
Running jps on hadoop104 showed normal output, but on hadoop103 this message kept appearing, and rebooting the machine did not help.
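For reference, the symptom looked roughly like this (a reconstructed illustration, not a saved log; 12610 is the PID of the DataNode I had killed):

# on hadoop103, even after the kill -9 and a reboot, jps kept showing:
jps
# 12610 -- process information unavailable
# 13001 Jps    (the Jps PID itself will of course differ)

The message means jps still finds a monitoring file for PID 12610 under /tmp/hsperfdata_<user>/ but cannot read any details about the process, typically because the process was killed abruptly or belongs to a different user.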

Solution:
1. Find the process ID with jps.

2. Run ps -ef | grep <pid> to check whether the process actually exists.

3. If it does not exist, go to /tmp/hsperfdata_<username> (this refers to the user that ran the Java process) and delete the stale entry there.

4. If it does exist and the current user only sees "process information unavailable", switch to that user and check again. A combined command sketch of these steps is given below the source link.
Original source: https://blog.youkuaiyun.com/xjp8587/article/details/80549409
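Putting the four steps together, the check looks roughly like this (a sketch only; <pid> and <username> are placeholders for the values from your own jps output, and exactly what to delete in step 3 is my reading of the original post):

# step 1: list Java processes and note the problematic PID
jps

# step 2: check whether that PID still corresponds to a live process
ps -ef | grep <pid> | grep -v grep

# step 3: if there is no such process, remove the stale monitoring file
#         left behind in the Java user's hsperfdata directory
rm -f /tmp/hsperfdata_<username>/<pid>

# step 4: if the process does exist, switch to the user that started it
#         and run jps again from that account
su - <username>
jps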
I am only a beginner and do not really understand what this method means in detail. The problematic PID in my case was 12610; using the command from step 2, I found that it had been run under my laybon user. I then switched to that user and ran jps, and the problematic process 12610 was no longer listed; after switching back to root, it was gone there as well.
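In other words, the sequence I actually ran was roughly the following (reconstructed from memory, so treat it as a sketch rather than an exact transcript):

# as root: check what the problematic PID belongs to
ps -ef | grep 12610 | grep -v grep   # this is how I tied it to the laybon user

# switch to that user and look again
su - laybon
jps        # 12610 was no longer listed here

# back to root and verify
exit
jps        # 12610 was gone here as well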

That was my situation. I do not really know why it worked, and it all felt rather odd, but I hope my experience happens to be useful to some other beginner.
