Problems encountered when connecting Eclipse on Windows to a Hadoop cluster

This article describes how to configure a Hadoop environment in Eclipse and test it with the WordCount program. It walks through the configuration steps, including adding the Hadoop installation path and opening the Map/Reduce perspective, and then resolves a permission problem and other common errors.

Prerequisites:
A Hadoop cluster (hadoop-2.5.0) is already deployed on the virtual machines.
Download hadoop-eclipse-plugin-2.5.0 and save it into eclipse\plugins.
Configuration steps:
1. Window -> Preferences, add the path of the extracted Hadoop distribution.
2. Window -> Open Perspective -> Other -> Map/Reduce -> OK.
3. In the Map/Reduce Locations view, right-click -> New Hadoop location.
The Map/Reduce Master port corresponds to yarn.resourcemanager.scheduler.address in yarn-site.xml; its default port is 8030.
The DFS Master port corresponds to dfs.namenode.rpc-address in hdfs-site.xml; its default port is 8020.
Click Finish when done.
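
For reference, here is a sketch of the two cluster-side entries these ports should match, assuming the NameNode and ResourceManager run on a host named master (the hostname is only a placeholder):

<!-- yarn-site.xml: the Map/Reduce Master port must match this address -->
<property>
    <name>yarn.resourcemanager.scheduler.address</name>
    <value>master:8030</value>
</property>

<!-- hdfs-site.xml: the DFS Master port must match this address -->
<property>
    <name>dfs.namenode.rpc-address</name>
    <value>master:8020</value>
</property>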

Problems encountered:
Eclipse reports "Permission denied" when connecting to the remote Hadoop cluster.
Solution: add the following property to hdfs-site.xml on the cluster (and restart HDFS) to turn off permission checking:

<property>
    <name>dfs.permissions.enabled</name>
    <value>false</value>
    <description>
       If "true", enable permission checking in HDFS.
       If "false", permission checking is turned off,
       but all other behavior is unchanged.
       Switching from one parameter value to the other does not change the mode,
       owner or group of files or directories.
    </description>
 </property>
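
If you prefer not to disable permission checking cluster-wide, a common alternative is to submit the job as the user that owns the HDFS directories. A minimal sketch, assuming that user is named hadoop (the user name is an assumption, not from the original setup):

// Set before any Configuration/FileSystem/Job object is created,
// so Hadoop submits the job as this remote user instead of the Windows login.
// "hadoop" is a placeholder; replace it with the owner of the HDFS directories.
System.setProperty("HADOOP_USER_NAME", "hadoop");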

Test run (using the WordCount program as an example)
The run fails with the following error:

log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Exception in thread "main" java.lang.NullPointerException
    at java.lang.ProcessBuilder.start(ProcessBuilder.java:1010)
    at org.apache.hadoop.util.Shell.runCommand(Shell.java:482)
    at org.apache.hadoop.util.Shell.run(Shell.java:455)
    at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:702)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:791)
    at org.apache.hadoop.util.Shell.execCommand(Shell.java:774)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:646)
    at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:434)
    at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:281)
    at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:125)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:348)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1303)
    at WordCount.main(WordCount.java:70)

The log4j warnings at the top only mean that log4j has not been configured and are harmless here. The NullPointerException itself occurs because running Map/Reduce programs on Windows requires winutils.exe and hadoop.dll. Download the 32-bit or 64-bit versions matching your system, copy both files into HADOOP_HOME\bin, copy hadoop.dll into C:\Windows\System32, and then run WordCount again.
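
For completeness, here is a minimal WordCount driver of the kind used in this test. It follows the standard Hadoop 2.x MapReduce example; the commented-out hadoop.home.dir path is only a placeholder:

import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

    // Splits each input line into words and emits (word, 1).
    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private final static IntWritable one = new IntWritable(1);
        private Text word = new Text();

        public void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, one);
            }
        }
    }

    // Sums up the counts emitted for each word.
    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable val : values) {
                sum += val.get();
            }
            result.set(sum);
            context.write(key, result);
        }
    }

    public static void main(String[] args) throws Exception {
        // Only needed if HADOOP_HOME is not set on Windows; the path is a placeholder.
        // System.setProperty("hadoop.home.dir", "D:\\hadoop-2.5.0");
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // HDFS input path
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // output dir must not exist yet
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

With winutils.exe and hadoop.dll in place, running this class from Eclipse with an HDFS input path and a not-yet-existing output path as arguments should no longer hit the NullPointerException above.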
