HDFS upload fails with org.apache.hadoop.fs.ChecksumException: Checksum error: file:/hyk/data/hyk.txt

While uploading the local file hyk.txt to HDFS, the command failed with a ChecksumException, i.e. a checksum error. On upload, the Hadoop client checks for a local .crc checksum file and refuses to copy the file if the verification fails. Deleting the stale .crc sidecar and uploading the file again resolves the problem.

Uploading a file from the local filesystem to HDFS fails with the following error:
fs.FSInputChecker: Found checksum error: b[0, 69]=6d6f77656968616861686168616868616686168616861686861680a
org.apache.hadoop.fs.ChecksumException: Checksum error: file:/hyk/data/hyk.txt

[root@node01 data]# hadoop fs -put hyk.txt /hyk/test
20/02/18 12:54:39 INFO fs.FSInputChecker: Found checksum error: b[0, 69]=6d6f77656968616861686168616868616686168616861686861680a
org.apache.hadoop.fs.ChecksumException: Checksum error: file:/hyk/data/hyk.txt at 0 exp: 599154106 got: -6
	at org.apache.hadoop.fs.FSInputChecker.verifySums(FSInputChecker.java:323)
	at org.apache.hadoop.fs.FSInputChecker.readChecksumChunk(FSInputChecker.java:279)
	at org.apache.hadoop.fs.FSInputChecker.read1(FSInputChecker.java:228)
	at org.apache.hadoop.fs.FSInputChecker.read(FSInputChecker.java:196)
	at java.io.DataInputStream.read(DataInputStream.java:100)
	at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:86)
	at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:60)
	at org.apache.hadoop.io.IOUtils.copyBytes(IOUtils.java:120)
	at org.apache.hadoop.fs.shell.CommandWithDestination$TargetFileSystem.writeStreamToFile(CommandWit
	at org.apache.hadoop.fs.shell.CommandWithDestination.copyStreamToTarget(CommandWithDestination.jav
	at org.apache.hadoop.fs.shell.CommandWithDestination.copyFileToTarget(CommandWithDestination.java:
	at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:263)
	at org.apache.hadoop.fs.shell.CommandWithDestination.processPath(CommandWithDestination.java:248)
	at org.apache.hadoop.fs.shell.Command.processPaths(Command.java:317)
	at org.apache.hadoop.fs.shell.Command.processPathArgument(Command.java:289)
	at org.apache.hadoop.fs.shell.CommandWithDestination.processPathArgument(CommandWithDestination.ja
	at org.apache.hadoop.fs.shell.Command.processArgument(Command.java:271)
	at org.apache.hadoop.fs.shell.Command.processArguments(Command.java:255)
	at org.apache.hadoop.fs.shell.CommandWithDestination.processArguments(CommandWithDestination.java:
	at org.apache.hadoop.fs.shell.CopyCommands$Put.processArguments(CopyCommands.java:267)
	at org.apache.hadoop.fs.shell.Command.processRawArguments(Command.java:201)
	at org.apache.hadoop.fs.shell.Command.run(Command.java:165)
	at org.apache.hadoop.fs.FsShell.run(FsShell.java:287)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
	at org.apache.hadoop.fs.FsShell.main(FsShell.java:340)
put: Checksum error: file:/hyk/data/hyk.txt at 0 exp: 599154106 got: -685625431

When the Hadoop client uploads the local file hyk.txt to the cluster, it reads the file through fs.FSInputChecker, which checks whether a .crc checksum file exists alongside the file to be uploaded.

If a .crc file exists, the client verifies the file's contents against it; if the verification fails, the file is not uploaded. In the error above, "exp" is the checksum recorded in the .crc sidecar and "got" is the checksum computed from the current contents of hyk.txt, so the sidecar no longer matches the file.
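
The log does not show how the stale sidecar was created. A common scenario (an assumption here, not confirmed by the original post) is that hyk.txt was earlier pulled from HDFS together with its CRC and then edited locally. A minimal sketch of that scenario, plus a quick way to confirm that the data file is newer than its sidecar:

# Assumed scenario, not taken from the log above: pulling the file down with -crc
# writes a .hyk.txt.crc sidecar next to it
hadoop fs -get -crc /hyk/test/hyk.txt /hyk/data/
# any local edit afterwards makes the stored checksum stale
echo "extra line" >> /hyk/data/hyk.txt

# quick check: if hyk.txt was modified after .hyk.txt.crc was written,
# the checksum in the sidecar can no longer match the file
stat -c '%y %n' /hyk/data/hyk.txt /hyk/data/.hyk.txt.crc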

The fix is as follows:

# check whether .hyk.txt.crc exists
[root@node01 data]# ls -a
.  ..  hyk.txt  .hyk.txt.crc  hyw.txt
# delete the .hyk.txt.crc file
[root@node01 data]# rm .hyk.txt.crc 
rm: remove regular file '.hyk.txt.crc'? y
[root@node01 data]# 

Upload the file from the local filesystem to HDFS again and it will succeed.
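
For completeness, a sketch of the retry using the same paths as above (output omitted; whether the file lands as /hyk/test or /hyk/test/hyk.txt depends on whether /hyk/test is a directory):

# remove the stale sidecar, then retry the upload
rm /hyk/data/.hyk.txt.crc
hadoop fs -put /hyk/data/hyk.txt /hyk/test
# confirm the upload
hadoop fs -ls /hyk/test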
