Building Hadoop 1.0.x with Eclipse on Ubuntu 10.10

This post records setting up a Hadoop build environment locally from scratch, including the problems hit along the way: an outdated Protocol Buffers version, a broken gcc toolchain, make errors, a missing shared library, and a Java version incompatibility, ending with a successful mvn build.


Following the instructions at http://wiki.apache.org/hadoop/EclipseEnvironment

Downloading Eclipse, Git, and Maven is omitted here.
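With the tools in place, the wiki page has you clone the Hadoop source before running Maven. A minimal sketch (the repository URL is an assumption; check the wiki page for the current mirror):

```shell
# Clone the Hadoop source tree described on the EclipseEnvironment wiki page
# (URL is an assumption; the wiki lists the authoritative mirrors)
git clone git://git.apache.org/hadoop-common.git
cd hadoop-common
```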

$ mvn install -DskipTests
$ mvn eclipse:eclipse -DdownloadSources=true -DdownloadJavadocs=true

1. The build failed

***********

[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [0.492s]

[INFO] Apache Hadoop Common .............................. FAILURE [2.357s]

[INFO] Apache Hadoop Common Project ...................... SKIPPED

***********

[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.6:run (compile-proto) on project hadoop-common: An Ant BuildException has occured: exec returned: 1 -> [Help 1]

Some digging showed the installed Protocol Buffers version was too old. I removed it and installed protobuf-2.4.1 following its README. Along the way gcc was broken and make failed; running sudo apt-get install build-essential fixed the toolchain, after which protobuf installed without further trouble.
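The recovery steps above can be sketched as follows (the version number comes from the post; the tarball name and the default /usr/local install prefix are assumptions based on a standard autotools build):

```shell
# Fix the toolchain first: build-essential pulls in gcc, g++, and make
sudo apt-get install build-essential

# Build and install protobuf 2.4.1 per its README
# (tarball name and default prefix /usr/local are assumptions)
tar xzf protobuf-2.4.1.tar.gz
cd protobuf-2.4.1
./configure
make
sudo make install

# Sanity check: protoc should now report the new version
protoc --version
```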

2. mvn failed again

[exec] protoc: error while loading shared libraries: libprotobuf.so.7: cannot open shared object file: No such file or directory

See:

http://mail-archives.apache.org/mod_mbox/hadoop-mapreduce-user/201201.mbox/%3CCALY2=u6s0dGy=ihUN55HRNHD9BWdQ-=7j1jenzRLoZ-7znxLJg@mail.gmail.com%3E

Running sudo ldconfig fixed it.
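This works because make install drops libprotobuf.so.7 under /usr/local/lib, which the dynamic linker does not pick up until its cache is refreshed. A quick way to confirm (the install path is an assumption based on the default prefix):

```shell
# Refresh the dynamic linker cache so protoc can resolve libprotobuf.so.7
sudo ldconfig

# Verify the library is now registered with the linker
ldconfig -p | grep libprotobuf
```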

3. mvn failed yet again
This time it was a Java version problem; I uninstalled OpenJDK and switched to the Sun JDK.
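On Ubuntu 10.10 the switch looks roughly like this. The package name and Sun JDK install path below are assumptions; adjust them to wherever the Sun JDK was unpacked:

```shell
# Remove OpenJDK (package name is an assumption for this Ubuntu release)
sudo apt-get remove openjdk-6-jdk

# Register the Sun JDK binaries as the system java/javac
# (/usr/lib/jvm/jdk1.6.0 is a hypothetical install path)
sudo update-alternatives --install /usr/bin/java java /usr/lib/jvm/jdk1.6.0/bin/java 1
sudo update-alternatives --install /usr/bin/javac javac /usr/lib/jvm/jdk1.6.0/bin/javac 1

# Confirm: the banner should say "Java(TM) SE Runtime Environment", not OpenJDK
java -version
```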
4. mvn finally succeeded
****************
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main ................................ SUCCESS [3.690s]
[INFO] Apache Hadoop Project POM ......................... SUCCESS [5.225s]
[INFO] Apache Hadoop Annotations ......................... SUCCESS [0.338s]
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [0.248s]
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.136s]
[INFO] Apache Hadoop Auth ................................ SUCCESS [0.527s]
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [0.388s]
[INFO] Apache Hadoop Common .............................. SUCCESS [16:38.901s]
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.289s]
[INFO] Apache Hadoop HDFS ................................ SUCCESS [1:29.129s]
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [1:19.747s]
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [7.824s]
[INFO] hadoop-yarn ....................................... SUCCESS [0.720s]
[INFO] hadoop-yarn-api ................................... SUCCESS [6:59.223s]
[INFO] hadoop-yarn-common ................................ SUCCESS [49.016s]
[INFO] hadoop-yarn-server ................................ SUCCESS [0.250s]
[INFO] hadoop-yarn-server-common ......................... SUCCESS [1.033s]
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [0.946s]
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [0.368s]
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [0.670s]
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [0.562s]
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.180s]
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [2.507s]
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [1.028s]
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [0.775s]
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [1.380s]
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [1.208s]
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [10.745s]
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [1.176s]
[INFO] Apache Hadoop Client .............................. SUCCESS [1.763s]
[INFO] Apache Hadoop Archives ............................ SUCCESS [1.957s]
[INFO] Apache Hadoop HDFS Raid ........................... SUCCESS [1.738s]
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.085s]
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.131s]
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [0.433s]
[INFO] hadoop-yarn-site .................................. SUCCESS [0.304s]
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [0.987s]
[INFO] hadoop-mapreduce .................................. SUCCESS [0.147s]
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [0.408s]
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [0.380s]
[INFO] Apache Hadoop Rumen ............................... SUCCESS [0.661s]
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [0.470s]
[INFO] Apache Hadoop Data Join ........................... SUCCESS [0.432s]
[INFO] Apache Hadoop Extras .............................. SUCCESS [0.406s]
[INFO] Apache Hadoop Pipes ............................... SUCCESS [0.076s]
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [3.775s]
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.140s]
[INFO] Apache Hadoop Distribution ........................ SUCCESS [0.541s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 28:18.640s
[INFO] Finished at: Sun Jul 01 10:40:24 CST 2012
[INFO] Final Memory: 47M/114M
The project then imported into Eclipse successfully.
