Windows MapReduce application error: Mkdirs failed to create C:\Users\...\Temp\hadoop-unjarxxx\META-INF\license

This article describes how to fix an IOException that occurs when running a Hadoop MapReduce application on Windows. By adding filters to pom.xml, the offending META-INF entries are excluded from the jar, which avoids the failure to create the temporary directory.

Problem:

Running a Hadoop MapReduce application on Windows fails with the following error:

E:\EclipsePro\ssm\aa>hadoop jar target\aa-0.0.1-SNAPSHOT.jar /1.txt /out5
Exception in thread "main" java.io.IOException: Mkdirs failed to create C:\Users\ADMINI~1\AppData\Local\Temp\hadoop-unjar3611278334038571997\META-INF\license
        at org.apache.hadoop.util.RunJar.ensureDirectory(RunJar.java:128)
        at org.apache.hadoop.util.RunJar.unJar(RunJar.java:104)
        at org.apache.hadoop.util.RunJar.unJar(RunJar.java:81)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:209)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)

aa-0.0.1-SNAPSHOT.jar is the MapReduce WordCount application written and packaged by the author; see the earlier post: MapReduce WordCount应用.
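The root cause (per the Stack Overflow thread referenced at the end of this post) is that the jar contains both a file META-INF/LICENSE and a directory META-INF/license/. `hadoop jar` unpacks the jar into a temp directory, and on Windows' case-insensitive filesystem the mkdirs() call for META-INF\license collides with the already-extracted LICENSE file. A minimal, self-contained sketch (not Hadoop code; the jar built here is synthetic) that detects such collisions:

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;
import java.util.ArrayList;
import java.util.Enumeration;
import java.util.List;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;
import java.util.jar.JarOutputStream;

public class LicenseCollisionCheck {

    /** Lists entry pairs that collide case-insensitively: a plain file vs. a directory. */
    public static List<String> findCollisions(File jar) throws IOException {
        List<String> names = new ArrayList<>();
        try (JarFile jf = new JarFile(jar)) {
            Enumeration<JarEntry> entries = jf.entries();
            while (entries.hasMoreElements()) {
                names.add(entries.nextElement().getName());
            }
        }
        List<String> collisions = new ArrayList<>();
        for (String dir : names) {
            if (!dir.endsWith("/")) continue;                    // only directory entries
            String dirPath = dir.substring(0, dir.length() - 1); // strip trailing slash
            for (String file : names) {
                if (!file.endsWith("/") && file.equalsIgnoreCase(dirPath)) {
                    collisions.add(file + " <-> " + dir);
                }
            }
        }
        return collisions;
    }

    public static void main(String[] args) throws IOException {
        // Build a tiny synthetic jar that mimics the problematic layout.
        File jar = Files.createTempFile("collision-demo", ".jar").toFile();
        try (JarOutputStream out = new JarOutputStream(Files.newOutputStream(jar.toPath()))) {
            out.putNextEntry(new JarEntry("META-INF/LICENSE"));  // plain file
            out.closeEntry();
            out.putNextEntry(new JarEntry("META-INF/license/")); // directory
            out.closeEntry();
        }
        // On Windows, extracting META-INF/LICENSE first makes the later
        // mkdirs() for META-INF\license fail, hence the IOException above.
        System.out.println(findCollisions(jar)); // prints [META-INF/LICENSE <-> META-INF/license/]
        jar.delete();
    }
}
```

The same check can be done by hand with `jar tf target\aa-0.0.1-SNAPSHOT.jar`, looking for entries that differ only in case.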

 

Solution:

Modify pom.xml.

Add the following between <configuration> and </configuration> of the plugin that builds the runnable jar (the filter syntax below is the maven-shade-plugin's):

            <filters>
              <filter>
                <artifact>*:*</artifact>
                <excludes>
                  <exclude>META-INF/*.SF</exclude>
                  <exclude>META-INF/*.DSA</exclude>
                  <exclude>META-INF/*.RSA</exclude>
                  <exclude>META-INF/LICENSE*</exclude>
                  <exclude>license/*</exclude>
                </excludes>
              </filter>
            </filters>
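For orientation, here is a minimal sketch of where those filters sit, assuming the runnable jar is built with the maven-shade-plugin (the plugin version below is illustrative):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version><!-- illustrative version -->
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <filters>
          <filter>
            <artifact>*:*</artifact>
            <excludes>
              <exclude>META-INF/*.SF</exclude>
              <exclude>META-INF/*.DSA</exclude>
              <exclude>META-INF/*.RSA</exclude>
              <exclude>META-INF/LICENSE*</exclude>
              <exclude>license/*</exclude>
            </excludes>
          </filter>
        </filters>
      </configuration>
    </execution>
  </executions>
</plugin>
```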

[Screenshot of the modified pom.xml]

Repackage the application:

mvn clean package

Run the jar again; this time it succeeds:

E:\EclipsePro\ssm\aa>hadoop jar target\aa-0.0.1-SNAPSHOT.jar /1.txt /o1
19/10/20 23:39:14 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
19/10/20 23:39:15 WARN mapreduce.JobResourceUploader: Hadoop command-line option parsing not performed. Implement the Tool interface and execute your application with ToolRunner to remedy this.
19/10/20 23:39:16 INFO input.FileInputFormat: Total input paths to process : 1
19/10/20 23:39:17 INFO mapreduce.JobSubmitter: number of splits:1
19/10/20 23:39:17 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1571582539006_0002
19/10/20 23:39:17 INFO impl.YarnClientImpl: Submitted application application_1571582539006_0002
19/10/20 23:39:17 INFO mapreduce.Job: The url to track the job: http://DESKTOP-EULLDM0:8088/proxy/application_1571582539006_0002/
19/10/20 23:39:17 INFO mapreduce.Job: Running job: job_1571582539006_0002
19/10/20 23:39:35 INFO mapreduce.Job: Job job_1571582539006_0002 running in uber mode : false
19/10/20 23:39:35 INFO mapreduce.Job:  map 0% reduce 0%
19/10/20 23:39:40 INFO mapreduce.Job:  map 100% reduce 0%
19/10/20 23:39:46 INFO mapreduce.Job:  map 100% reduce 100%
19/10/20 23:39:56 INFO mapreduce.Job: Job job_1571582539006_0002 completed successfully
19/10/20 23:39:56 INFO mapreduce.Job: Counters: 49
        File System Counters
                FILE: Number of bytes read=55
                FILE: Number of bytes written=238911
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=117
                HDFS: Number of bytes written=25
                HDFS: Number of read operations=6
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=2
        Job Counters
                Launched map tasks=1
                Launched reduce tasks=1
                Data-local map tasks=1
                Total time spent by all maps in occupied slots (ms)=2658
                Total time spent by all reduces in occupied slots (ms)=3281
                Total time spent by all map tasks (ms)=2658
                Total time spent by all reduce tasks (ms)=3281
                Total vcore-milliseconds taken by all map tasks=2658
                Total vcore-milliseconds taken by all reduce tasks=3281
                Total megabyte-milliseconds taken by all map tasks=2721792
                Total megabyte-milliseconds taken by all reduce tasks=3359744
        Map-Reduce Framework
                Map input records=2
                Map output records=4
                Map output bytes=41
                Map output materialized bytes=55
                Input split bytes=92
                Combine input records=0
                Combine output records=0
                Reduce input groups=3
                Reduce shuffle bytes=55
                Reduce input records=4
                Reduce output records=3
                Spilled Records=8
                Shuffled Maps =1
                Failed Shuffles=0
                Merged Map outputs=1
                GC time elapsed (ms)=68
                CPU time spent (ms)=1403
                Physical memory (bytes) snapshot=435302400
                Virtual memory (bytes) snapshot=592404480
                Total committed heap usage (bytes)=295174144
        Shuffle Errors
                BAD_ID=0
                CONNECTION=0
                IO_ERROR=0
                WRONG_LENGTH=0
                WRONG_MAP=0
                WRONG_REDUCE=0
        File Input Format Counters
                Bytes Read=25
        File Output Format Counters
                Bytes Written=25

Reference: https://stackoverflow.com/questions/10522835/hadoop-java-io-ioexception-mkdirs-failed-to-create-some-path

 

Done! Enjoy it!

Related issue: dependency 'org.apache.hadoop:hadoop-mapreduce-clientjobclient:3.3.6' not found

A "cannot find dependency 'org.apache.hadoop:hadoop-mapreduce-clientjobclient:3.3.6'" error typically occurs in Java projects built with Maven or Gradle. It means your pom.xml (Maven) or build.gradle (Gradle) references version 3.3.6 of the Apache Hadoop MapReduce Job Client, but that jar was never correctly added to the project's classpath during compilation or installation. (Note that the artifactId in the error message is missing a hyphen; the correct coordinate is hadoop-mapreduce-client-jobclient.)

Steps to resolve:

1. **Check the version**: make sure the Hadoop version specified in your Maven or Gradle configuration matches a version that actually exists. If 3.3.6 is not available, use a version that is.

2. **Add the dependency**:

   - Maven: add the correct Hadoop dependency to pom.xml:

```xml
<dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
    <version>3.3.6</version>
</dependency>
```

   - Gradle: add the equivalent line to build.gradle:

```groovy
implementation 'org.apache.hadoop:hadoop-mapreduce-client-jobclient:3.3.6'
```

3. **Check the local repository**: make sure your local Maven or Gradle repository actually contains this artifact. If not, it needs to be downloaded from Maven Central (or another source) into your local repository.

4. **Re-sync/rebuild**: run `mvn clean install` (Maven) or `gradle build` (Gradle) to force the project's dependencies to update.

5. **Check network connectivity**: if all of the above looks correct, the download may be failing for network reasons; confirm your machine can reach the Maven or Gradle repository servers.

6. **Exclude conflicts**: check whether another dependency pulls in a conflicting version; you may need to adjust versions or add exclusions.

If you are working inside a corporate network, also check whether the company firewall allows access to the external repositories.
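For the conflict-exclusion step above, a hypothetical sketch of what an exclusion can look like in pom.xml (the hadoop-client dependency here is only an illustrative carrier of the conflicting transitive version):

```xml
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>3.3.6</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-mapreduce-client-jobclient</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Running `mvn dependency:tree` shows which dependency actually pulls in the conflicting version.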