Hadoop won't upload files: Configured Capacity shows 0 (0 B)

After HDFS refused to accept file uploads, I noticed that Configured Capacity was 0, i.e. no space available. But my disks were nearly empty, so I ran

```
hdfs dfsadmin -report
```

to check whether HDFS had actually been allocated any space, and indeed it reported Configured Capacity: 0 (0 B). Searching online, most answers said that formatting the NameNode multiple times causes the capacity to show as 0, but I had only formatted once, so the problem couldn't be there. Other articles pointed to a broken hosts file, yet my hosts file looked correct too. Then I noticed something odd: pinging my own master host failed, even though the IP in the hosts file was right.
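For the record, the "formatted multiple times" failure mode usually comes down to a clusterID mismatch: each `hdfs namenode -format` generates a new clusterID in the NameNode's VERSION file, while the DataNodes keep the old one, get rejected at registration, and the cluster reports 0 capacity. A minimal self-contained sketch of that check (the /tmp paths and CID values are made up for illustration; on a real cluster the VERSION files live under dfs.namenode.name.dir and dfs.datanode.data.dir from hdfs-site.xml):

```shell
# Demo of the clusterID check; the /tmp paths stand in for the real
# dfs.namenode.name.dir / dfs.datanode.data.dir (see hdfs-site.xml).
mkdir -p /tmp/nn_demo/current /tmp/dn_demo/current
echo "clusterID=CID-aaaa" > /tmp/nn_demo/current/VERSION   # NameNode side
echo "clusterID=CID-bbbb" > /tmp/dn_demo/current/VERSION   # stale DataNode side

nn_id=$(grep '^clusterID=' /tmp/nn_demo/current/VERSION | cut -d= -f2)
dn_id=$(grep '^clusterID=' /tmp/dn_demo/current/VERSION | cut -d= -f2)

if [ "$nn_id" != "$dn_id" ]; then
  # On a real cluster: copy the NameNode's clusterID into the DataNode's
  # VERSION file (or wipe the data dir) and restart the DataNode.
  echo "clusterID mismatch: DataNode would be rejected, capacity shows 0"
fi
```

Since I had only formatted once, both IDs matched in my case, which ruled this out.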
Following that IP, I tried pinging from different networks: over my broadband connection the ping failed, but over mobile data it worked. Very confusing.
However, when you log in to your server over SSH, you are greeted with a screen like this:
![Screenshot of the SSH login banner](https://img-blog.csdnimg.cn/20200304203653703.png?x-oss-process=image/watermark,type_ZmFuZ3poZW5naGVpdGk,shadow_10,text_aHR0cHM6Ly9ibG9nLmNzZG4ubmV0L3FxXzQxMDM3MDEy,size_16,color_FFFFFF,t_70)
The part circled in red is the IP address of your last login, but the IP that `ifconfig` reported on the machine was a different one.
Once I updated the hosts file with this new IP, everything worked.
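In other words, the fix boils down to keeping the hosts file in sync with the address the machine actually has: compare the `ifconfig` (or `ip addr`) output against the master's entry in /etc/hosts on every node. A self-contained sketch of that check and fix, using a temp file in place of /etc/hosts (the hostname hadoop04 and both IPs are illustrative):

```shell
# Demo: /tmp/hosts_demo stands in for /etc/hosts; hostname and IPs are examples.
cat > /tmp/hosts_demo <<'EOF'
127.0.0.1       localhost
192.168.100.99  hadoop04
EOF

actual_ip="192.168.100.104"   # what ifconfig / ip addr reports right now
mapped_ip=$(awk '$2 == "hadoop04" {print $1}' /tmp/hosts_demo)

if [ "$mapped_ip" != "$actual_ip" ]; then
  # Replace the stale mapping in place; on a real cluster, repeat on every
  # node, then verify with `ping hadoop04` and `hdfs dfsadmin -report`.
  sed -i "s/^$mapped_ip/$actual_ip/" /tmp/hosts_demo
fi
```

After the hosts files agree with the real IP, `hdfs dfsadmin -report` should show a non-zero Configured Capacity again.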
