Hadoop + Eclipse: hadoop-eclipse-plugin-2.7.6.jar not recognized by Eclipse

This article describes how to install the hadoop-eclipse-plugin-2.7.6.jar plugin in Eclipse on Windows for Hadoop development. The steps cover downloading the plugin, placing it in the correct directory, and verifying the settings and features that appear once the plugin is installed.


We are used to Windows and want to develop Hadoop code on Windows because it is convenient for debugging. To set up this environment, we need to install the Hadoop plugin for Eclipse: the hadoop-eclipse-plugin-2.7.6.jar package.

1. Download hadoop-eclipse-plugin-2.7.6.jar and place it in either the dropins or the plugins directory under the Eclipse installation. Sometimes, when the plugin is placed in plugins, Eclipse does not show it after startup; in that case, move it to the other of the two directories. As shown in the two figures below:
(screenshots: the plugin jar placed in the Eclipse dropins / plugins directory)
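On Windows this copy is usually done in Explorer, but the step above can be sketched as shell commands. The paths below are assumptions for illustration (a home-directory Eclipse install and a jar in Downloads); substitute your actual locations:

```shell
# Assumed demo layout -- in a real setup these directories already exist.
ECLIPSE_HOME="$HOME/eclipse"
mkdir -p "$ECLIPSE_HOME/dropins" "$ECLIPSE_HOME/plugins" "$HOME/Downloads"
PLUGIN_JAR="$HOME/Downloads/hadoop-eclipse-plugin-2.7.6.jar"
touch "$PLUGIN_JAR"   # placeholder for the jar you downloaded

# Try dropins first; if Eclipse does not pick the plugin up there,
# move the jar to plugins instead (one of the two usually works).
cp "$PLUGIN_JAR" "$ECLIPSE_HOME/dropins/"

# Verify the jar is in place before restarting Eclipse.
ls "$ECLIPSE_HOME/dropins/"
```

If the plugin still does not appear after restarting, starting Eclipse with the `-clean` flag forces it to rescan its plugin directories.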

2. After placing the plugin in the appropriate directory, open Window → Show View → Other, and you will see that the Map/Reduce entries have been added, as shown in the two figures below:
(screenshots: the Show View dialog with the new Map/Reduce entries)

3. Finally, when creating a new project you will see a Hadoop project option, as shown below:

(screenshot: the New Project wizard with the Map/Reduce project option)
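Once the plugin is working, the project you create is an ordinary Java project, and the WordCount job tested below is just tokenize-and-count logic spread across a Mapper, Combiner, and Reducer. As a minimal plain-Java sketch of that same logic (no Hadoop dependency; class and method names here are illustrative, not part of the Hadoop API):

```java
import java.util.Map;
import java.util.TreeMap;

public class WordCountSketch {
    // Count word occurrences, mirroring what the MapReduce WordCount job
    // computes: the map phase emits (word, 1) pairs, and the reduce phase
    // sums the ones for each word.
    static Map<String, Integer> countWords(String text) {
        Map<String, Integer> counts = new TreeMap<>();
        for (String token : text.split("\\s+")) {
            if (token.isEmpty()) continue;          // skip blanks from splitting
            counts.merge(token, 1, Integer::sum);   // the "reduce" step
        }
        return counts;
    }

    public static void main(String[] args) {
        // → {eclipse=1, hadoop=1, hello=2}
        System.out.println(countWords("hello hadoop hello eclipse"));
    }
}
```

The MapReduce version gains nothing for a string this small; its value is that the same per-word summation runs in parallel across HDFS blocks, which is what the job log below reports.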
I hope this helps others who, like me, are on the road to learning Hadoop.

The wordcount example program has been tested:

    [hadoop@test Desktop]$ hadoop jar wordcount.jar \
    > /user/hadoop/input/file* /user/hadoop/output
    18/05/25 19:51:32 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
    18/05/25 19:51:32 INFO mapreduce.JobSubmissionFiles: Permissions on staging directory /tmp/hadoop-yarn/staging/hadoop/.staging are incorrect: rwxrwxrwx. Fixing permissions to correct value rwx------
    18/05/25 19:51:34 INFO input.FileInputFormat: Total input paths to process : 3
    18/05/25 19:51:35 INFO mapreduce.JobSubmitter: number of splits:3
    18/05/25 19:51:35 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1527248744555_0001
    18/05/25 19:51:36 INFO impl.YarnClientImpl: Submitted application application_1527248744555_0001
    18/05/25 19:51:36 INFO mapreduce.Job: The url to track the job: http://test:8088/proxy/application_1527248744555_0001/
    18/05/25 19:51:36 INFO mapreduce.Job: Running job: job_1527248744555_0001
    18/05/25 19:51:49 INFO mapreduce.Job: Job job_1527248744555_0001 running in uber mode : false
    18/05/25 19:51:49 INFO mapreduce.Job:  map 0% reduce 0%
    18/05/25 19:52:20 INFO mapreduce.Job:  map 100% reduce 0%
    18/05/25 19:52:29 INFO mapreduce.Job:  map 100% reduce 100%
    18/05/25 19:52:31 INFO mapreduce.Job: Job job_1527248744555_0001 completed successfully
    18/05/25 19:52:32 INFO mapreduce.Job: Counters: 49
        File System Counters
            FILE: Number of bytes read=186
            FILE: Number of bytes written=491001
            FILE: Number of read operations=0
            FILE: Number of large read operations=0
            FILE: Number of write operations=0
            HDFS: Number of bytes read=471
            HDFS: Number of bytes written=40
            HDFS: Number of read operations=12
            HDFS: Number of large read operations=0
            HDFS: Number of write operations=2
        Job Counters
            Launched map tasks=3
            Launched reduce tasks=1
            Data-local map tasks=3
            Total time spent by all maps in occupied slots (ms)=86763
            Total time spent by all reduces in occupied slots (ms)=5836
            Total time spent by all map tasks (ms)=86763
            Total time spent by all reduce tasks (ms)=5836
            Total vcore-milliseconds taken by all map tasks=86763
            Total vcore-milliseconds taken by all reduce tasks=5836
            Total megabyte-milliseconds taken by all map tasks=88845312
            Total megabyte-milliseconds taken by all reduce tasks=5976064
        Map-Reduce Framework
            Map input records=6
            Map output records=24
            Map output bytes=225
            Map output materialized bytes=198
            Input split bytes=342
            Combine input records=24
            Combine output records=15
            Reduce input groups=5
            Reduce shuffle bytes=198
            Reduce input records=15
            Reduce output records=5
            Spilled Records=30
            Shuffled Maps =3
            Failed Shuffles=0
            Merged Map outputs=3
            GC time elapsed (ms)=647
            CPU time spent (ms)=4390
            Physical memory (bytes) snapshot=893743104
            Virtual memory (bytes) snapshot=8465371136
            Total committed heap usage (bytes)=659030016
        Shuffle Errors
            BAD_ID=0
            CONNECTION=0
            IO_ERROR=0
            WRONG_LENGTH=0
            WRONG_MAP=0
            WRONG_REDUCE=0
        File Input Format Counters
            Bytes Read=129
        File Output Format Counters
            Bytes Written=40