Running MapReduce from IDEA fails: Could not locate Hadoop executable: C:\hadoop-3.1.1\bin\winutils.exe

This post walks through the FileNotFoundException and UnsatisfiedLinkError you can hit when running MapReduce on Windows, and gives the concrete fix: placing winutils.exe and hadoop.dll in the correct locations.


  Running MapReduce on Windows fails with:

Exception in thread "main" java.lang.RuntimeException: java.io.FileNotFoundException: Could not locate Hadoop executable: C:\hadoop-3.1.1\bin\winutils.exe -see https://wiki.apache.org/hadoop/WindowsProblems
    at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:737)
    at org.apache.hadoop.util.Shell.getSetPermissionCommand(Shell.java:272)
    at org.apache.hadoop.util.Shell.getSetPermissionCommand(Shell.java:288)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:840)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:522)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:562)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:534)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:561)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:539)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:332)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:162)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:113)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:151)
    at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570)
    at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1588)
    at com.venn.demo.WordCount.main(WordCount.java:77)
Caused by: java.io.FileNotFoundException: Could not locate Hadoop executable: C:\hadoop-3.1.1\bin\winutils.exe -see https://wiki.apache.org/hadoop/WindowsProblems
    at org.apache.hadoop.util.Shell.getQualifiedBinInner(Shell.java:620)
    at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:593)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:690)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:78)
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:3533)
    at org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:3528)
    at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3370)
    at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:477)
    at com.venn.util.HdfsUtil.isPathExist(HdfsUtil.java:18)
    at com.venn.util.HdfsUtil.handleOutPutPath(HdfsUtil.java:47)
    at com.venn.demo.WordCount.main(WordCount.java:28)

Note: C:\hadoop-3.1.1 is the path configured as HADOOP_HOME.

  Download the winutils.exe matching your Hadoop version and place it under C:\hadoop-3.1.1\bin\.
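Before re-running the job, it can save a round trip to verify the file is where Hadoop will look for it. The sketch below (a hypothetical helper class, not part of Hadoop) mirrors the lookup that Hadoop's Shell class performs: %HADOOP_HOME%\bin\winutils.exe.

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class WinutilsCheck {
    // Mirrors Hadoop's lookup: <HADOOP_HOME>\bin\winutils.exe
    static Path expectedWinutils(String hadoopHome) {
        return Paths.get(hadoopHome, "bin", "winutils.exe");
    }

    public static void main(String[] args) {
        String home = System.getenv("HADOOP_HOME");
        if (home == null) {
            System.out.println("HADOOP_HOME is not set");
            return;
        }
        Path p = expectedWinutils(home);
        System.out.println(p + (Files.exists(p) ? " - found" : " - MISSING"));
    }
}
```

If this prints "MISSING", the FileNotFoundException above will reproduce no matter what else you change.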

  Then you will find there is a new error. O(∩_∩)O

Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
    at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:640)
    at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:1223)
    at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:160)
    at org.apache.hadoop.util.DiskChecker.checkDirInternal(DiskChecker.java:100)
    at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:77)
    at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.confChanged(LocalDirAllocator.java:315)
    at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:378)
    at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:152)
    at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:133)
    at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:117)
    at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:124)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:172)
    at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:788)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:254)
    at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570)
    at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1588)
    at com.venn.demo.WordCount.main(WordCount.java:77)

 

  Next, download the matching version of hadoop.dll and place it in C:\Windows\System32, and the job runs.
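Copying into System32 works because that directory is on the JVM's default native library search path (java.library.path), which is what the UnsatisfiedLinkError is complaining about. A minimal sketch (hypothetical helper, not Hadoop code) that checks whether hadoop.dll is visible on that path:

```java
import java.io.File;

public class HadoopDllCheck {
    // hadoop.dll must sit in one of the directories listed in
    // java.library.path; C:\Windows\System32 is there by default.
    static boolean foundOn(String libraryPath, String dllName) {
        for (String dir : libraryPath.split(File.pathSeparator)) {
            if (new File(dir, dllName).isFile()) return true;
        }
        return false;
    }

    public static void main(String[] args) {
        String libPath = System.getProperty("java.library.path", "");
        System.out.println("hadoop.dll on java.library.path: "
                + foundOn(libPath, "hadoop.dll"));
    }
}
```

An alternative to System32 is adding %HADOOP_HOME%\bin (where hadoop.dll can also live) to the PATH environment variable, since Windows includes PATH directories in the native library search.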

 

  Generally, any winutils.exe and hadoop.dll built from the same major Hadoop version should work.
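If you prefer not to rely on the HADOOP_HOME environment variable at all, Hadoop's Shell class also consults the hadoop.home.dir system property when the environment variable is absent. A sketch of setting it before any Hadoop class loads (the path is this post's example install location, adjust to yours):

```java
public class SetHadoopHome {
    public static void main(String[] args) {
        // hadoop.home.dir is the system property Hadoop's Shell class
        // falls back to when HADOOP_HOME is not set in the environment.
        // Must be set before the first Hadoop class is loaded.
        System.setProperty("hadoop.home.dir", "C:\\hadoop-3.1.1");
        System.out.println(System.getProperty("hadoop.home.dir"));
    }
}
```

This is handy in IDEA, where editing system-wide environment variables requires restarting the IDE, while a system property takes effect on the next run.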


Reposted from: https://www.cnblogs.com/Springmoon-venn/p/10061109.html
