After installing the namenode and datanode:
1) Formatting HDFS succeeded
hadoop namenode -format : formats the HDFS file system
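A minimal sketch of the format step, assuming a Hadoop 1.x layout and a hypothetical install path:

cd /usr/local/hadoop            # hypothetical install directory
bin/hadoop namenode -format     # initializes the NameNode metadata directory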
2) Starting all the Hadoop daemons succeeded
If the datanode fails to start, it is usually because of the firewall; disabling the firewall fixes it (see the commands below).
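For reference, a sketch of the start step and the firewall workaround, assuming a Hadoop 1.x install and a CentOS 6-style system where the firewall is iptables:

bin/start-all.sh            # starts NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker
jps                         # confirm the daemons are actually running
service iptables stop       # stop the firewall for the current session
chkconfig iptables off      # keep it disabled across reboots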
3) But checking HDFS failed
hadoop dfsadmin -report : prints the cluster report
The connection could not be established:
15/06/16 11:02:39 INFO ipc.Client: Retrying connect to server: /172.16.80.150:9000. Already tried 0 time(s).
15/06/16 11:02:40 INFO ipc.Client: Retrying connect to server: /172.16.80.150:9000. Already tried 1 time(s).
15/06/16 11:02:41 INFO ipc.Client: Retrying connect to server: /172.16.80.150:9000. Already tried 2 time(s).
15/06/16 11:02:42 INFO ipc.Client: Retrying connect to server: /172.16.80.150:9000. Already tried 3 time(s).
15/06/16 11:02:43 INFO ipc.Client: Retrying connect to server: /172.16.80.150:9000. Already tried 4 time(s).
15/06/16 11:02:44 INFO ipc.Client: Retrying connect to server: /172.16.80.150:9000. Already tried 5 time(s).
15/06/16 11:02:45 INFO ipc.Client: Retrying connect to server: /172.16.80.150:9000. Already tried 6 time(s).
15/06/16 11:02:46 INFO ipc.Client: Retrying connect to server: /172.16.80.150:9000. Already tried 7 time(s).
15/06/16 11:02:47 INFO ipc.Client: Retrying connect to server: /172.16.80.150:9000. Already tried 8 time(s).
15/06/16 11:02:48 INFO ipc.Client: Retrying connect to server: /172.16.80.150:9000. Already tried 9 time(s).
Bad connection to DFS... command aborted.
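When the report aborts like this, it helps to first check whether the NameNode is really listening on the address the client is dialing. A quick sketch, assuming standard Linux tools on the NameNode host:

jps                              # is the NameNode process up at all?
netstat -tlnp | grep 9000        # which address is port 9000 bound to?
telnet 172.16.80.150 9000        # from the client host: is that address reachable?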
After going over it several times I found the problem: the value of fs.default.name below (originally highlighted in red) was an IP address, and with that value the client simply could not connect to HDFS; after changing it to localhost the connection succeeded. Checking further, the IP written in the configuration turned out to be a mapped IP, not this machine's own IP. A careless mistake I should not repeat: whenever you configure an IP, at least check it against the address that ifconfig reports.
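A quick way to catch this kind of mismatch, assuming the offending mapping lives in /etc/hosts:

ifconfig                 # the addresses actually configured on this machine
cat /etc/hosts           # hostname-to-IP mappings the config value may resolve through
hostname                 # the name this machine answers to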
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Put site-specific property overrides in this file. -->
<configuration>
  <property>
    <!-- your NameNode address: host name plus port -->
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
    <description>The name of the default file system. Either the literal string "local" or a host:port for DFS.</description>
  </property>
</configuration>
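If other machines need to reach the NameNode, the value should instead be an address that ifconfig actually reports (or a hostname that resolves to it). A sketch using the 172.16.80.150 address from the log above, assuming that is the machine's real IP:

  <property>
    <name>fs.default.name</name>
    <value>hdfs://172.16.80.150:9000</value>
  </property>

Note that fs.default.name is the Hadoop 1.x property name; from Hadoop 2.x onward it is deprecated in favor of fs.defaultFS.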