Fixing a Hadoop local-mode MapReduce error (Exception in thread "main" ExitCodeException exitCode=-1073741701)

This post works through the exceptions thrown when running a MapReduce job in Hadoop local mode from IDEA, covering fixes for the winutils.exe binary and the NativeIO class, after which the job runs successfully.

Environment:

OS: Windows 10
Hadoop: 2.7.6
IDE: IntelliJ IDEA 2020.1
JDK: 1.8.0_191

Problem:

Running a MapReduce job in Hadoop local mode from IDEA throws the following exception:

Exception in thread "main" ExitCodeException exitCode=-1073741701: 
at org.apache.hadoop.util.Shell.runCommand(Shell.java:545)
at org.apache.hadoop.util.Shell.run(Shell.java:456)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:815)
at org.apache.hadoop.util.Shell.execCommand(Shell.java:798)
at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:731)
at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:489)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:529)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:507)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:305)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:133)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:144)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1290)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1287)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1287)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
at test.WordCount.main(WordCount.java:63)
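For context, the entry point at the bottom of the trace (test.WordCount.main) is an ordinary WordCount driver. The post does not include its source, so the sketch below is an assumed, minimal local-mode version with placeholder input/output paths, just to show where waitForCompletion() sits in the trace:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Minimal local-mode WordCount driver (placeholder paths; not the author's original source).
public class WordCount {

    public static class TokenizerMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context context)
                throws IOException, InterruptedException {
            // Split each line on whitespace and emit (word, 1).
            for (String token : value.toString().split("\\s+")) {
                if (!token.isEmpty()) {
                    word.set(token);
                    context.write(word, ONE);
                }
            }
        }
    }

    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        // With no *-site.xml on the classpath this runs in local mode:
        // fs.defaultFS=file:/// and mapreduce.framework.name=local.
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path("D:/tmp/wc/input"));    // placeholder
        FileOutputFormat.setOutputPath(job, new Path("D:/tmp/wc/output")); // placeholder
        // The stack trace above is thrown from waitForCompletion() while the
        // job staging directory is created via winutils.exe.
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```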

After hitting this exception I went through a lot of material online. It is well known that to run Hadoop on Windows you need to place hadoop.dll and winutils.exe in the bin directory of the Hadoop installation (the Apache release does not ship these Windows native binaries, so they have to be added separately). I had already done that, but I noticed a problem: double-clicking winutils.exe produced a message saying the file could not run.
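For reference, Hadoop locates winutils.exe through the HADOOP_HOME environment variable or the hadoop.home.dir system property. If the IDE run configuration does not inherit HADOOP_HOME, the property can be set at the very start of the driver; a minimal sketch, assuming the installation lives at D:\hadoop-2.7.6 (an example path, not from the post):

```java
// Point Hadoop at the local installation so bin\winutils.exe and bin\hadoop.dll can be found.
// The directory below is an example; use the actual Hadoop install path.
public class Driver {
    public static void main(String[] args) throws Exception {
        System.setProperty("hadoop.home.dir", "D:\\hadoop-2.7.6");
        // ... build and submit the MapReduce Job as usual ...
    }
}
```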

Troubleshooting

① First, fix winutils.exe not being able to run.

Exit code -1073741701 is 0xC000007B (STATUS_INVALID_IMAGE_FORMAT): on Windows it usually means a process cannot start because a required runtime DLL (typically a Visual C++ redistributable) is missing or of the wrong bitness, so Hadoop's call to winutils.exe fails before winutils ever runs. Downloading the DirectX repair tool through 360 Security Guard and running it restores those runtime libraries and completed the repair of winutils.exe, which could then be launched normally.

(Screenshot: the DirectX repair tool after completing the repair)

With winutils.exe repaired, running the task again produced the following exception:

D:\software\it_software\jdk\jdk\bin\java.exe "-javaagent:D:\software\it_software\idea\idea2020\IntelliJ IDEA 2020.1\lib\idea_rt.jar=51446:D:\software\it_software\idea\idea2020\IntelliJ IDEA 2020.1\bin" -Dfile.encoding=UTF-8 -classpath D:\software\it_software\jdk\jdk\jre\lib\charsets.jar;D:\software\it_software\jdk\jdk\jre\lib\deploy.jar;D:\software\it_software\jdk\jdk\jre\lib\ext\access-bridge-32.jar;D:\software\it_software\jdk\jdk\jre\lib\ext\cldrdata.jar;D:\software\it_software\jdk\jdk\jre\lib\ext\dnsns.jar;D:\software\it_software\jdk\jdk\jre\lib\ext\jaccess.jar;D:\software\it_software\jdk\jdk\jre\lib\ext\jfxrt.jar;D:\software\it_software\jdk\jdk\jre\lib\ext\localedata.jar;D:\software\it_software\jdk\jdk\jre\lib\ext\nashorn.jar;D:\software\it_software\jdk\jdk\jre\lib\ext\sunec.jar;D:\software\it_software\jdk\jdk\jre\lib\ext\sunjce_provider.jar;D:\software\it_software\jdk\jdk\jre\lib\ext\sunmscapi.jar;D:\software\it_software\jdk\jdk\jre\lib\ext\sunpkcs11.jar;D:\software\it_software\jdk\jdk\jre\lib\ext\zipfs.jar;D:\software\it_software\jdk\jdk\jre\lib\javaws.jar;D:\software\it_software\jdk\jdk\jre\lib\jce.jar;D:\software\it_software\jdk\jdk\jre\lib\jfr.jar;D:\software\it_software\jdk\jdk\jre\lib\jfxswt.jar;D:\software\it_software\jdk\jdk\jre\lib\jsse.jar;D:\software\it_software\jdk\jdk\jre\lib\management-agent.jar;D:\software\it_software\jdk\jdk\jre\lib\plugin.jar;D:\software\it_software\jdk\jdk\jre\lib\resources.jar;D:\software\it_software\jdk\jdk\jre\lib\rt.jar;D:\maven_modle\target\classes;D:\mavrepo\org\apache\hadoop\hadoop-common\2.7.6\hadoop-common-2.7.6.jar;D:\mavrepo\org\apache\hadoop\hadoop-annotations\2.7.6\hadoop-annotations-2.7.6.jar;D:\mavrepo\com\google\guava\guava\11.0.2\guava-11.0.2.jar;D:\mavrepo\commons-cli\commons-cli\1.2\commons-cli-1.2.jar;D:\mavrepo\org\apache\commons\commons-math3\3.1.1\commons-math3-3.1.1.jar;D:\mavrepo\xmlenc\xmlenc\0.52\xmlenc-0.52.jar;D:\mavrepo\commons-httpclient\commons-httpclient\3.1\commons-httpclient-3.1.jar;D:\mavrepo\commons-codec\commons-codec\1.4\commons-codec-1.4.jar;D:\mavrepo\commons-io\commons-io\2.4\commons-io-2.4.jar;D:\mavrepo\commons-net\commons-net\3.1\commons-net-3.1.jar;D:\mavrepo\commons-collections\commons-collections\3.2.2\commons-collections-3.2.2.jar;D:\mavrepo\javax\servlet\servlet-api\2.5\servlet-api-2.5.jar;D:\mavrepo\org\mortbay\jetty\jetty\6.1.26\jetty-6.1.26.jar;D:\mavrepo\org\mortbay\jetty\jetty-util\6.1.26\jetty-util-6.1.26.jar;D:\mavrepo\org\mortbay\jetty\jetty-sslengine\6.1.26\jetty-sslengine-6.1.26.jar;D:\mavrepo\javax\servlet\jsp\jsp-api\2.1\jsp-api-2.1.jar;D:\mavrepo\com\sun\jersey\jersey-core\1.9\jersey-core-1.9.jar;D:\mavrepo\com\sun\jersey\jersey-json\1.9\jersey-json-1.9.jar;D:\mavrepo\org\codehaus\jettison\jettison\1.1\jettison-1.1.jar;D:\mavrepo\com\sun\xml\bind\jaxb-impl\2.2.3-1\jaxb-impl-2.2.3-1.jar;D:\mavrepo\org\codehaus\jackson\jackson-jaxrs\1.8.3\jackson-jaxrs-1.8.3.jar;D:\mavrepo\org\codehaus\jackson\jackson-xc\1.8.3\jackson-xc-1.8.3.jar;D:\mavrepo\com\sun\jersey\jersey-server\1.9\jersey-server-1.9.jar;D:\mavrepo\asm\asm\3.1\asm-3.1.jar;D:\mavrepo\commons-logging\commons-logging\1.1.3\commons-logging-1.1.3.jar;D:\mavrepo\log4j\log4j\1.2.17\log4j-1.2.17.jar;D:\mavrepo\net\java\dev\jets3t\jets3t\0.9.0\jets3t-0.9.0.jar;D:\mavrepo\org\apache\httpcomponents\httpclient\4.1.2\httpclient-4.1.2.jar;D:\mavrepo\org\apache\httpcomponents\httpcore\4.1.2\httpcore-4.1.2.jar;D:\mavrepo\com\jamesmurty\utils\java-xmlbuilder\0.4\java-xmlbuilder-0.4.jar;D:\mavrepo\commons-lang\commons-lang\2.6\commons-lang-2.6.jar;D:\
mavrepo\commons-configuration\commons-configuration\1.6\commons-configuration-1.6.jar;D:\mavrepo\commons-digester\commons-digester\1.8\commons-digester-1.8.jar;D:\mavrepo\commons-beanutils\commons-beanutils\1.7.0\commons-beanutils-1.7.0.jar;D:\mavrepo\commons-beanutils\commons-beanutils-core\1.8.0\commons-beanutils-core-1.8.0.jar;D:\mavrepo\org\slf4j\slf4j-api\1.7.10\slf4j-api-1.7.10.jar;D:\mavrepo\org\slf4j\slf4j-log4j12\1.7.10\slf4j-log4j12-1.7.10.jar;D:\mavrepo\org\codehaus\jackson\jackson-core-asl\1.9.13\jackson-core-asl-1.9.13.jar;D:\mavrepo\org\codehaus\jackson\jackson-mapper-asl\1.9.13\jackson-mapper-asl-1.9.13.jar;D:\mavrepo\org\apache\avro\avro\1.7.4\avro-1.7.4.jar;D:\mavrepo\com\thoughtworks\paranamer\paranamer\2.3\paranamer-2.3.jar;D:\mavrepo\org\xerial\snappy\snappy-java\1.0.4.1\snappy-java-1.0.4.1.jar;D:\mavrepo\com\google\protobuf\protobuf-java\2.5.0\protobuf-java-2.5.0.jar;D:\mavrepo\com\google\code\gson\gson\2.2.4\gson-2.2.4.jar;D:\mavrepo\org\apache\hadoop\hadoop-auth\2.7.6\hadoop-auth-2.7.6.jar;D:\mavrepo\org\apache\directory\server\apacheds-kerberos-codec\2.0.0-M15\apacheds-kerberos-codec-2.0
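Independent of the MapReduce job, a quick way to check whether winutils.exe itself is healthy is to launch it directly from Java and look at the exit value. The helper below is a hypothetical sketch (the class name, the use of HADOOP_HOME, and the ls check are assumptions, not part of the original post); an exit value such as -1073741701 here points at missing or mismatched runtime DLLs rather than at Hadoop.

```java
import java.io.File;

// Hypothetical sanity check for winutils.exe (not from the original post).
public class WinutilsCheck {
    public static void main(String[] args) throws Exception {
        String home = System.getenv("HADOOP_HOME"); // or the hadoop.home.dir value
        File winutils = new File(home, "bin\\winutils.exe");
        System.out.println("winutils.exe exists: " + winutils.isFile());

        // "winutils.exe ls <dir>" is a lightweight command; exit code 0 means the binary starts and runs.
        Process p = new ProcessBuilder(winutils.getAbsolutePath(), "ls", "C:\\")
                .inheritIO()
                .start();
        System.out.println("exit code: " + p.waitFor());
    }
}
```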