Compiling the Hadoop 2.2.0 Source Code

This article walks through compiling the Hadoop 2.2.0 source code on CentOS 6.5, including installing the required tools (JDK, Maven, Ant, and others) and working around a build bug and several dependency issues along the way.

I. Install CentOS 6.5

Download it from http://www.centoscn.com/CentosSoft/iso/. Be sure to grab the 64-bit ISO; it is about 4 GB, so the download takes a while. Any 6.x release will actually do, not just 6.5. I used a VMware virtual machine with 2 GB of RAM and 20 GB of disk. With less memory the build will be slow, and with less disk it may run out of space during compilation. These are not minimum requirements, so adjust them to your own machine. Also, make sure the Linux guest stays connected to the network throughout.

Note: the steps below download and install a number of packages. I copied every downloaded package into /usr/local/app/; pay attention to each package's version and install path.
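If you want to do this preparation from a shell, a minimal sketch looks like the following (it assumes you are root and the network is already up):

           mkdir -p /usr/local/app           # directory that will hold every downloaded package
           cd /usr/local/app
           uname -m                          # should print x86_64 on a 64-bit install
           ping -c 3 archive.apache.org      # confirm the guest can reach the internet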

II. Download the Required Components

1. Download the hadoop-2.2.0 source

    From: https://archive.apache.org/dist/hadoop/common/hadoop-2.2.0/
    Choose: hadoop-2.2.0-src.tar.gz
2. Download apache-ant
      (the ant that ships with CentOS is too old and causes errors during the build)
      From: http://archive.apache.org/dist/ant/binaries/
      Choose: apache-ant-1.9.4-bin.tar.gz
3. Download protobuf-2.5.0.tar.gz
      (Protocol Buffers, Google's data serialization format)
      From: https://developers.google.com/protocol-buffers/docs/downloads (the official site may be unreachable from mainland China without a proxy; domestic mirror links are easy to find via Baidu)
      Note: Hadoop 2.2.0 must be built against protobuf 2.5.0 exactly; with a mismatched version the build will fail.
4. Download findbugs
      From: http://sourceforge.net/projects/findbugs/files/findbugs/3.0.0/
      Choose: findbugs-3.0.0.tar.gz
5. Download maven
      From: http://maven.apache.org/download.cgi
      Choose: apache-maven-3.0.5-bin.tar.gz
6. Download the JDK
      From: http://www.oracle.com/technetwork/java/javase/downloads/java-archive-downloads-javase7-521261.html
      Choose: Java SE Development Kit 7u45 → jdk-7u45-linux-x64.tar.gz
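If you prefer to fetch everything from the command line, here is a sketch. The first three URLs come from the Apache archive pages listed above; the findbugs and protobuf URLs are commonly used mirror locations that I am assuming are still live, so substitute your own if they 404:

           cd /usr/local/app
           wget https://archive.apache.org/dist/hadoop/common/hadoop-2.2.0/hadoop-2.2.0-src.tar.gz
           wget http://archive.apache.org/dist/ant/binaries/apache-ant-1.9.4-bin.tar.gz
           wget https://archive.apache.org/dist/maven/maven-3/3.0.5/binaries/apache-maven-3.0.5-bin.tar.gz
           # assumed mirrors -- replace with whatever download link you found:
           wget -O findbugs-3.0.0.tar.gz http://sourceforge.net/projects/findbugs/files/findbugs/3.0.0/findbugs-3.0.0.tar.gz/download
           wget https://github.com/protocolbuffers/protobuf/releases/download/v2.5.0/protobuf-2.5.0.tar.gz
           # the JDK requires accepting Oracle's license, so download
           # jdk-7u45-linux-x64.tar.gz in a browser and copy it here manually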

III. Install Each Component

1. Install the JDK

1.1 Extract: tar -zxvf jdk-7u45-linux-x64.tar.gz
1.2 Rename: mv jdk1.7.0_45 jdk
1.3 Set the environment variables:
     1.3.1 Open the file: vi /etc/profile
     1.3.2 Add the following lines: export JAVA_HOME=/usr/local/app/jdk
                                    export PATH=.:$PATH:$JAVA_HOME/bin
1.4 Save and exit, then run:

              source /etc/profile

1.5 Verify: java -version

2. Install Maven

2.1 Extract: tar -zxvf apache-maven-3.0.5-bin.tar.gz
2.2 Rename: mv apache-maven-3.0.5 maven
2.3 Set the environment variables:
     2.3.1 Open the file: vi /etc/profile
     2.3.2 Add: export MAVEN_HOME=/usr/local/app/maven
           and change PATH to: export PATH=.:$PATH:$JAVA_HOME/bin:$MAVEN_HOME/bin
2.4 Save and exit, then run:

              source /etc/profile

2.5 Verify: mvn -version

3. Install Ant

3.1 Extract: tar -zxvf apache-ant-1.9.4-bin.tar.gz
3.2 Rename: mv apache-ant-1.9.4 ant
3.3 Set the environment variables:
     3.3.1 Open the file: vi /etc/profile
     3.3.2 Add: export ANT_HOME=/usr/local/app/ant
           and change PATH to: export PATH=.:$PATH:$JAVA_HOME/bin:$MAVEN_HOME/bin:$ANT_HOME/bin
3.4 Save and exit, then run:

              source /etc/profile

3.5 Verify: ant -version

4. Install findbugs (optional)

4.1 Extract: tar -zxvf findbugs-3.0.0.tar.gz
4.2 Rename: mv findbugs-3.0.0 findbugs
4.3 Set the environment variables:
     4.3.1 Open the file: vi /etc/profile
     4.3.2 Add: export FINDBUGS_HOME=/usr/local/app/findbugs
           and change PATH to: export PATH=.:$PATH:$JAVA_HOME/bin:$MAVEN_HOME/bin:$ANT_HOME/bin:$FINDBUGS_HOME/bin
4.4 Save and exit, then run:

              source /etc/profile

4.5 Verify: findbugs -version

5. Install protoc

5.1 Preparation: Hadoop uses Protocol Buffers for its RPC communication, and building protoc from source requires a compiler toolchain. Run the following commands in order:
             yum install gcc
             yum install gcc-c++
             yum install make
     Note: on CentOS 6.5, gcc and make are usually installed already; other versions may lack them. You will need to type "y" at the prompts while these commands run.
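To check whether the toolchain is already present, a quick sanity check (assuming a stock RPM-based install) is:

             rpm -q gcc gcc-c++ make    # each should print an installed version, not "is not installed"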
5.2 Extract: tar -zxvf protobuf-2.5.0.tar.gz
5.3 Rename: mv protobuf-2.5.0 protobuf
5.4 Enter the protobuf directory, cd protobuf, and run, in order:
            ./configure --prefix=/usr/local/protoc/
            make && make install
5.5 Set the environment variables:
     5.5.1 Open the file: vi /etc/profile
     5.5.2 Add (the path must match the --prefix used in 5.4): export PROTOC_HOME=/usr/local/protoc
           and change PATH to: export PATH=.:$PATH:$JAVA_HOME/bin:$MAVEN_HOME/bin:$ANT_HOME/bin:$FINDBUGS_HOME/bin:$PROTOC_HOME/bin
5.6 Save and exit, then run:

              source /etc/profile

5.7 Verify: protoc --version
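At this point the tail of /etc/profile should look roughly like the following consolidated sketch (paths assume the install locations used in steps 1 through 5):

              export JAVA_HOME=/usr/local/app/jdk
              export MAVEN_HOME=/usr/local/app/maven
              export ANT_HOME=/usr/local/app/ant
              export FINDBUGS_HOME=/usr/local/app/findbugs
              export PROTOC_HOME=/usr/local/protoc
              export PATH=.:$PATH:$JAVA_HOME/bin:$MAVEN_HOME/bin:$ANT_HOME/bin:$FINDBUGS_HOME/bin:$PROTOC_HOME/bin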

6. Install the remaining dependencies

Run the following commands in order:
            yum install cmake
            yum install openssl-devel
            yum install ncurses-devel
Once these finish, all prerequisites are in place.
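These three packages can also be installed in a single command, with -y answering the confirmation prompts automatically:

            yum install -y cmake openssl-devel ncurses-devel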

IV. Compile the Hadoop 2.2.0 Source

1. Extract the source

           tar -zxvf hadoop-2.2.0-src.tar.gz

2. Fix a known bug

2.1 The source tree contains a bug that must be patched before building. Edit the file pom.xml in the directory /usr/local/app/hadoop-2.2.0-src/hadoop-common-project/hadoop-auth:

           gedit pom.xml

2.2 After line 55 of that file, add the following:

                <dependency>
                    <groupId>org.mortbay.jetty</groupId>
                    <artifactId>jetty-util</artifactId>
                    <scope>test</scope>
                </dependency>

    Save and exit.
2.3 This bug is tracked at https://issues.apache.org/jira/browse/HADOOP-10110 and has been fixed in later Hadoop releases.
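If you want to double-check that the dependency landed in the right place, a quick optional grep works:

           grep -n -B2 -A2 jetty-util pom.xml    # should print the five lines of the new test-scoped <dependency> block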

3. Compile the source

3.1 Now change into the source directory, cd /usr/local/app/hadoop-2.2.0-src, and run:

              mvn package -DskipTests -Pdist,native,docs

3.2 If you skipped step 4 (findbugs), just drop docs from the command above and no documentation will be generated, as shown below.
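That is, without the documentation the command becomes:

              mvn package -DskipTests -Pdist,native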

This command downloads the dependency jars from the internet and then compiles the Hadoop source. It takes a long time, so go have a meal in the meantime.

After a long wait, you should see output like the following:

[INFO] Apache Hadoop Main................................ SUCCESS [6.936s]
[INFO] Apache Hadoop Project POM......................... SUCCESS [4.928s]
[INFO] Apache Hadoop Annotations......................... SUCCESS [9.399s]
[INFO] Apache Hadoop Assemblies.......................... SUCCESS [0.871s]
[INFO] Apache Hadoop Project Dist POM.................... SUCCESS [7.981s]
[INFO] Apache Hadoop Maven Plugins....................... SUCCESS [8.965s]
[INFO] Apache Hadoop Auth................................ SUCCESS [39.748s]
[INFO] Apache Hadoop Auth Examples....................... SUCCESS [11.081s]
[INFO] Apache Hadoop Common.............................. SUCCESS [10:41.466s]
[INFO] Apache Hadoop NFS................................. SUCCESS [26.346s]
[INFO] Apache Hadoop Common Project...................... SUCCESS [0.061s]
[INFO] Apache Hadoop HDFS................................ SUCCESS [12:49.368s]
[INFO] Apache Hadoop HttpFS.............................. SUCCESS [41.896s]
[INFO] Apache Hadoop HDFS BookKeeper Journal............. SUCCESS [41.043s]
[INFO] Apache Hadoop HDFS-NFS............................ SUCCESS [9.650s]
[INFO] Apache Hadoop HDFS Project........................ SUCCESS [0.051s]
[INFO] hadoop-yarn....................................... SUCCESS [1:22.693s]
[INFO] hadoop-yarn-api................................... SUCCESS [1:20.262s]
[INFO] hadoop-yarn-common................................ SUCCESS [1:30.530s]
[INFO] hadoop-yarn-server................................ SUCCESS [0.177s]
[INFO] hadoop-yarn-server-common......................... SUCCESS [15.781s]
[INFO] hadoop-yarn-server-nodemanager.................... SUCCESS [40.800s]
[INFO] hadoop-yarn-server-web-proxy...................... SUCCESS [6.099s]
[INFO] hadoop-yarn-server-resourcemanager................ SUCCESS [37.639s]
[INFO] hadoop-yarn-server-tests.......................... SUCCESS [4.516s]
[INFO] hadoop-yarn-client................................ SUCCESS [25.594s]
[INFO] hadoop-yarn-applications.......................... SUCCESS [0.286s]
[INFO] hadoop-yarn-applications-distributedshell......... SUCCESS [10.143s]
[INFO] hadoop-mapreduce-client........................... SUCCESS [0.119s]
[INFO] hadoop-mapreduce-client-core...................... SUCCESS [55.812s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher.... SUCCESS [8.749s]
[INFO] hadoop-yarn-site.................................. SUCCESS [0.524s]
[INFO] hadoop-yarn-project............................... SUCCESS [16.641s]
[INFO] hadoop-mapreduce-client-common.................... SUCCESS [40.796s]
[INFO] hadoop-mapreduce-client-shuffle................... SUCCESS [7.628s]
[INFO] hadoop-mapreduce-client-app....................... SUCCESS [24.066s]
[INFO] hadoop-mapreduce-client-hs........................ SUCCESS [13.243s]
[INFO] hadoop-mapreduce-client-jobclient................. SUCCESS [16.670s]
[INFO] hadoop-mapreduce-client-hs-plugins................ SUCCESS [3.787s]
[INFO] Apache Hadoop MapReduce Examples.................. SUCCESS [17.012s]
[INFO] hadoop-mapreduce.................................. SUCCESS [6.459s]
[INFO] Apache Hadoop MapReduce Streaming................. SUCCESS [12.149s]
[INFO] Apache Hadoop Distributed Copy.................... SUCCESS [15.968s]
[INFO] Apache Hadoop Archives............................ SUCCESS [5.851s]
[INFO] Apache Hadoop Rumen............................... SUCCESS [18.364s]
[INFO] Apache Hadoop Gridmix............................. SUCCESS [14.943s]
[INFO] Apache Hadoop Data Join........................... SUCCESS [9.648s]
[INFO] Apache Hadoop Extras.............................. SUCCESS [5.763s]
[INFO] Apache Hadoop Pipes............................... SUCCESS [16.289s]
[INFO] Apache Hadoop Tools Dist.......................... SUCCESS [3.261s]
[INFO] Apache Hadoop Tools............................... SUCCESS [0.043s]
[INFO] Apache Hadoop Distribution........................ SUCCESS [56.188s]
[INFO] Apache Hadoop Client.............................. SUCCESS [10.910s]
[INFO] Apache Hadoop Mini-Cluster........................ SUCCESS [0.321s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 40:00.444s
[INFO] Finished at: Thu Dec 26 12:42:24 CST 2013
[INFO] Final Memory: 109M/362M
[INFO] ------------------------------------------------------------------------

 

3.3 Done; the build is complete. The compiled distribution is under /usr/local/app/hadoop-2.2.0-src/hadoop-dist/target.
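To sanity-check the result, something like the following should work (a sketch; the exact file names assume the default 2.2.0 build layout):

           cd /usr/local/app/hadoop-2.2.0-src/hadoop-dist/target
           ls                                                 # hadoop-2.2.0/ and hadoop-2.2.0.tar.gz should be here
           hadoop-2.2.0/bin/hadoop version                    # should report Hadoop 2.2.0
           file hadoop-2.2.0/lib/native/libhadoop.so.1.0.0    # a 64-bit ELF confirms the native profile built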

 
