Importing the hadoop-2.6.0-src Source into Eclipse

This post covers the steps for building the Hadoop source and fixes for the common errors hit along the way, including the missing AvroRecord class and protobuf setup problems, with detailed resolution steps.

I. Importing

First fix the source as described in item 3 of Section II (Fixing Errors) below.

1. cd into hadoop-2.6.0-src/hadoop-maven-plugins and run:

mvn install

2. Then cd into the hadoop-2.6.0-src directory and run:

mvn eclipse:eclipse -DskipTests

3. In Eclipse: File -> Import -> Existing Projects into Workspace, then select the hadoop-2.6.0-src directory to import.
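Put together, steps 1 and 2 amount to the following command sequence (a sketch only — Maven must be on the PATH, and the location of hadoop-2.6.0-src will differ on your machine):

```shell
# 1. build and install the Hadoop Maven plugins first; later modules depend on them
cd hadoop-2.6.0-src/hadoop-maven-plugins
mvn install

# 2. from the source root, generate Eclipse project files for every module, skipping tests
cd ..
mvn eclipse:eclipse -DskipTests
```

After this, each module directory contains the .project and .classpath files that Eclipse's "Existing Projects into Workspace" importer looks for.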

II. Fixing Errors

1. /hadoop-common/src/test/java/org/apache/hadoop/io/serializer/avro/TestAvroSerialization.java reports that the AvroRecord class cannot be found.

Download version 2.5.0 or later from http://grepcode.com/file/repo1.maven.org/maven2/org.apache.hadoop/hadoop-common/2.5.0/org/apache/hadoop/io/serializer/avro/AvroRecord.java/ and place it in the corresponding package.

2. The org.apache.hadoop.ipc package has no protobuf folder and is missing the corresponding Java files. Create a protobuf folder under org.apache.hadoop.ipc and move (do not copy) the corresponding Java files from /hadoop-common/target/generated-sources/java/org/apache/hadoop/ipc/protobuf into it — copying leaves two copies of each class on the build path and causes duplicate-definition errors.
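The move can be scripted. Below is a sketch against a mock directory tree (the demo/ prefix and the ProtobufRpcEngineProtos.java file name are placeholders — substitute the real hadoop-common paths and whatever files were generated in your build):

```shell
set -e
# mock layout standing in for the real hadoop-common module
mkdir -p demo/target/generated-sources/java/org/apache/hadoop/ipc/protobuf
mkdir -p demo/src/main/java/org/apache/hadoop/ipc
touch demo/target/generated-sources/java/org/apache/hadoop/ipc/protobuf/ProtobufRpcEngineProtos.java

# mv, not cp: leaving the generated copy behind gives Eclipse two
# definitions of every class and triggers duplicate-definition errors
mv demo/target/generated-sources/java/org/apache/hadoop/ipc/protobuf \
   demo/src/main/java/org/apache/hadoop/ipc/protobuf
```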

3. hadoop-streaming has a broken build path, showing home/haodoop/桌面/hadoop-2.6.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/conf (missing).

I had already imported all the source files into my Eclipse workspace, but this entry still pointed at the copy on the desktop — the directory where I ran mvn install and mvn eclipse:eclipse in Section I. Advice found online says: right-click the hadoop-streaming project -> Properties -> Java Build Path (on the left) -> Source -> select the broken entry and remove the reference. That worked at the time, but the same problem came back the next time Eclipse was opened. I am not very familiar with Java or Eclipse; what follows is the fix that worked for me.

Fix: select the hadoop-streaming project and create a new folder named config under the hadoop-streaming directory. Copy the files from hadoop-2.6.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/conf into the new config folder. Then right-click -> Properties -> Java Build Path (on the left) -> Source -> select the broken entry; there is an Edit button on the right.

In the pom.xml file under the hadoop-streaming directory, find:

<directory>${basedir}/../../hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/conf</directory>
and change it to:
<directory>${basedir}/config</directory> 
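Rather than editing pom.xml by hand, the element can be patched with sed. A sketch against a minimal mock pom.xml (run the same sed line against the real hadoop-streaming/pom.xml in your tree):

```shell
set -e
# mock pom.xml containing only the element to be rewritten
cat > pom.xml <<'EOF'
<directory>${basedir}/../../hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/conf</directory>
EOF

# point the resource directory at the local config copy instead of the
# resourcemanager conf directory elsewhere in the source tree
sed -i 's|<directory>${basedir}/../../hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/conf</directory>|<directory>${basedir}/config</directory>|' pom.xml
cat pom.xml
```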

Then redo the steps from Section I (mvn install, then mvn eclipse:eclipse), delete the broken hadoop-streaming project in Eclipse, and import the freshly generated hadoop-streaming project. This time, right-click -> Properties -> Java Build Path -> Source no longer listed the broken conf entry — only two entries remained. To be safe, I also added one more with Add Folder.

Error #1. hadoop-streaming has a broken build path, showing /root/workspace/hadoop-2.2.0-src/hadoop-yarn-project/hadoop-yarn/hadoop-yarn-server/hadoop-yarn-server-resourcemanager/conf (missing).

Fix: just remove the stale reference.


Error #2. hadoop-hdfs/src/test/java/org/apache/hadoop/hdfs/TestDFSClientFailover.java reports an error on sun.net.spi.nameservice.NameService. This is a type that needs to be imported; it exists in OpenJDK but was not found in the Oracle JDK, so it has to be downloaded. NameService is an interface — find a copy online and place it in that package. http://grepcode.com/file/repository.grepcode.com/java/root/jdk/openjdk/7u40-b43/sun/net/spi/nameservice/NameService.java#NameService


Error #3. /hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/tools/offlineEditsViewer/XmlEditsVisitor.java reports that
import com.sun.org.apache.xml.internal.serialize.OutputFormat; 
import com.sun.org.apache.xml.internal.serialize.XMLSerializer; 
fail. This is due to Eclipse's strict access rules: open Java -> Compiler -> Errors/Warnings and, under "Deprecated and restricted API", change the setting of "Forbidden reference (access rules)" from Error to Warning.


Error #4. /hadoop-common/src/test/java/org/apache/hadoop/io/serializer/avro/TestAvroSerialization.java reports that the AvroRecord class is missing; find the AvroRecord class online and put it in the same package. http://grepcode.com/file/repo1.maven.org/maven2/org.apache.hadoop/hadoop-common/2.2.0/org/apache/hadoop/io/serializer/avro/AvroRecord.java#AvroRecord


Error #5. The org.apache.hadoop.ipc.protobuf package is empty; find the protobuf folder under /hadoop-common/target/generated-sources/java and copy it into /hadoop-common/src/test/java. The package is also missing the following three referenced classes — find them on GrepCode, download the corresponding hadoop-common 2.2.0 files, and import them:

org.apache.hadoop.ipc.protobuf.TestProtos.EchoRequestProto;
org.apache.hadoop.ipc.protobuf.TestProtos.EchoResponseProto;
org.apache.hadoop.ipc.protobuf.TestRpcServiceProtos.TestProtobufRpcProto;


