Hadoop: java.io.IOException: Tmp directory already exists

This post documents a problem hit while running the PiEstimator test on Hadoop: the job's tmp directory already existed, so the program refused to run, and the fix was simply to delete that directory.


Today I set up Hadoop on my machine for testing. The first time I ran:
bin/hadoop jar hadoop-examples-1.0.3.jar pi 10 100

everything worked fine. I then ran stop-all.sh, changed a few configuration files, and ran start-all.sh. When I ran the same command again, it failed with the following error:
shuumatoMacBook:hadoop-1.0.3 Vito$ bin/hadoop jar hadoop-examples-1.0.3.jar pi 10 100
Number of Maps = 10
Samples per Map = 100
2012-08-30 11:07:54.740 java[4131:1703] Unable to load realm info from SCDynamicStore
java.io.IOException: Tmp directory hdfs://localhost:8020/user/Vito/PiEstimator_TMP_3_141592654 already exists. Please remove it first.
at org.apache.hadoop.examples.PiEstimator.estimate(PiEstimator.java:270)
at org.apache.hadoop.examples.PiEstimator.run(PiEstimator.java:342)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.examples.PiEstimator.main(PiEstimator.java:351)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:64)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
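
Looking at the top frame, the exception is thrown from PiEstimator.estimate() before the job is even submitted: the example writes its input files into a temporary HDFS directory and refuses to run if that directory is already there, for example when it was left behind by an earlier or interrupted run. Below is a minimal sketch of that kind of pre-flight check using the HDFS FileSystem API; the directory name is simply the one from the error message, not looked up from the Hadoop source:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class TmpDirCheck {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Path taken from the error message above (the name encodes 3.141592654).
        Path tmpDir = new Path("PiEstimator_TMP_3_141592654");

        // Roughly the kind of check the example performs: refuse to run
        // if the working directory already exists, instead of overwriting it.
        if (fs.exists(tmpDir)) {
            throw new IOException("Tmp directory " + fs.makeQualified(tmpDir)
                    + " already exists. Please remove it first.");
        }
        // Otherwise create it and continue with the job setup.
        if (!fs.mkdirs(tmpDir)) {
            throw new IOException("Cannot create directory " + tmpDir);
        }
    }
}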


The message is clear enough on its face: the tmp directory has already been created and needs to be removed before the job can run, so let's delete it:
bin/hadoop fs -rmr hdfs://localhost:8020/user/Vito/PiEstimator_TMP_3_141592654

Running the test again succeeded.
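
If you rerun this test often, the leftover directory can also be removed programmatically instead of typing the fs -rmr command each time. A small sketch along those lines, again using the FileSystem API; the fs.default.name value and the /user/Vito path are just the ones from this particular setup and should be adjusted to your cluster:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CleanupTmpDir {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // NameNode address from this post; Hadoop 1.x uses fs.default.name.
        conf.set("fs.default.name", "hdfs://localhost:8020");

        FileSystem fs = FileSystem.get(conf);
        Path tmpDir = new Path("/user/Vito/PiEstimator_TMP_3_141592654");

        // delete(path, true) removes the directory recursively,
        // the programmatic equivalent of `hadoop fs -rmr`.
        if (fs.exists(tmpDir) && fs.delete(tmpDir, true)) {
            System.out.println("Removed " + tmpDir);
        }
        fs.close();
    }
}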