Compiling Hadoop 3.0 on CentOS

Environment:

CentOS 7

protobuf 2.5 (must be exactly 2.5; neither newer nor older versions work)

Maven 3.5.4 (anything 3.3 or later is fine)

Hadoop 3.0 (the 3.0.3 source is used below)

CMake 3.3
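
Once the steps below are done, the versions can be confirmed in one pass (a minimal check, assuming all the tools ended up on the PATH):

gcc --version
cmake --version      # should report 3.3.x
protoc --version     # should report libprotoc 2.5.0
mvn --version        # should report 3.3 or later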


1. Install the required libraries

yum install gcc gcc-c++ ncurses-devel perl openssl-devel
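
protobuf's autogen.sh in step 3 also depends on the autotools; if they are not already installed, they can be pulled in here as well (an extra suggestion, not part of the original list):

sudo yum install autoconf automake libtool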


2. Installing CMake

The CMake version that ships with CentOS 7 is too old, so uninstall it first:

sudo yum remove cmake


Then download the official source package:

wget https://cmake.org/files/v3.3/cmake-3.3.2.tar.gz
tar zxvf cmake-3.3.2.tar.gz
cd cmake-3.3.2 

Build and install:

./configure
make
make install

Rename the directory and add it to the PATH (this assumes the cmake-3.3.2 source tree was extracted under /usr/local; with the default prefix, make install also puts cmake into /usr/local/bin):

mv cmake-3.3.2  cmake 
sudo vim /etc/profile

PATH=/usr/local/cmake/bin:$PATH
export PATH

Apply the change:

source /etc/profile  

Verify:

cmake --version

If the reported version is 3.3, the installation succeeded.



3. Installing protobuf

Only version 2.5 works.

Download from: https://github.com/google/protobuf/tree/v2.5.0

Download the zip archive and extract it:

unzip protobuf-2.5.0.zip -d /usr/local

Go into the directory and run the build bootstrap:

cd  /usr/local/protobuf-2.5.0
sudo ./autogen.sh

An error may appear here:

Google Test not present.  Fetching gtest-1.5.0 from the web...

This happens because the protobuf project moved to GitHub, and the 2.5.0 release's autogen.sh still points at the old gtest download URL.

So replace that part of autogen.sh with the following:

# Check that gtest is present.  Usually it is already there since the
# directory is set up as an SVN external.
if test ! -e gtest; then
  echo "Google Test not present.  Fetching gtest-1.5.0 from the web..."
  wget https://github.com/google/googletest/archive/release-1.5.0.tar.gz
  tar xzvf release-1.5.0.tar.gz
  mv googletest-release-1.5.0 gtest
fi

Alternatively, download the gtest release from GitHub yourself, then extract it and rename the directory to gtest.

After that, run autogen.sh again; it generates the configure script.

Then run:

sudo ./configure --prefix=/usr/local/protobuf-2.5.0
sudo make
sudo make install

After the installation finishes, add protobuf to the PATH:

sudo vim /etc/profile

export PATH=$PATH:/usr/local/protobuf-2.5.0/bin
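
If protoc later fails to start because it cannot find its shared libraries (a common symptom with a non-standard install prefix), the library directory can be added to /etc/profile as well (an extra line, not in the original post):

export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/protobuf-2.5.0/lib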

Then apply it:

source /etc/profile

Check whether protobuf installed successfully:

protoc --version

If the version number is printed, the installation succeeded.
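
As an extra smoke test beyond the version check, a tiny .proto file can be compiled (a minimal sketch; the file name and message are made up for illustration):

cat > /tmp/test.proto <<'EOF'
message Ping {
  required int32 id = 1;
}
EOF
protoc --proto_path=/tmp --cpp_out=/tmp /tmp/test.proto
ls /tmp/test.pb.cc /tmp/test.pb.h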



4. Installing Maven

Download version 3.5.4 from the official site:

http://maven.apache.org/download.cgi

Choose the binary package, download it, and extract it:

tar -zxvf apache-maven-3.5.4-bin.tar.gz -C /usr/local

Add the environment variables:

sudo vim /etc/profile

export MAVEN_HOME=/usr/local/apache-maven-3.5.4
export PATH=.:$PATH:$JAVA_HOME/bin:$MAVEN_HOME/bin

Then reload the profile:

source /etc/profile

Check that the installation succeeded:

mvn --version
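
Note that both Maven and the Hadoop build itself need a JDK, and the profile above references $JAVA_HOME. The original post assumes a JDK is already present; if not, something like the following works on CentOS 7 (the JAVA_HOME path is an assumption and may differ on your machine):

sudo yum install java-1.8.0-openjdk java-1.8.0-openjdk-devel
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk   # adjust to the actual install path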



5. Compiling Hadoop

Download the source package from:

http://hadoop.apache.org/releases.html

Then extract it to /usr/local:

tar -zxvf hadoop-3.0.3-src.tar.gz -C /usr/local

Then enter the directory and start the build:

cd /usr/local/hadoop-3.0.3-src
mvn clean package -Pdist,native -DskipTests -Dtar 
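
The build is memory-hungry; if it dies with an OutOfMemoryError, giving Maven a larger heap before rerunning the same command is a common remedy (an optional tweak, not part of the original run):

export MAVEN_OPTS="-Xms256m -Xmx1536m"
mvn clean package -Pdist,native -DskipTests -Dtar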

After a very long wait (mine was a virtual machine and the build ran for roughly four and a half hours), the build completes:

[INFO] No site descriptor found: nothing to attach.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop Main 3.0.3 ........................... SUCCESS [  5.485 s]
[INFO] Apache Hadoop Build Tools .......................... SUCCESS [  6.540 s]
[INFO] Apache Hadoop Project POM .......................... SUCCESS [  4.501 s]
[INFO] Apache Hadoop Annotations .......................... SUCCESS [ 13.965 s]
[INFO] Apache Hadoop Assemblies ........................... SUCCESS [  1.277 s]
[INFO] Apache Hadoop Project Dist POM ..................... SUCCESS [  5.486 s]
[INFO] Apache Hadoop Maven Plugins ........................ SUCCESS [ 19.216 s]
[INFO] Apache Hadoop MiniKDC .............................. SUCCESS [ 11.586 s]
[INFO] Apache Hadoop Auth ................................. SUCCESS [ 31.088 s]
[INFO] Apache Hadoop Auth Examples ........................ SUCCESS [ 11.317 s]
[INFO] Apache Hadoop Common ............................... SUCCESS [04:24 min]
[INFO] Apache Hadoop NFS .................................. SUCCESS [ 24.975 s]
[INFO] Apache Hadoop KMS .................................. SUCCESS [31:57 min]
[INFO] Apache Hadoop Common Project ....................... SUCCESS [  1.534 s]
[INFO] Apache Hadoop HDFS Client .......................... SUCCESS [37:19 min]
[INFO] Apache Hadoop HDFS ................................. SUCCESS [06:07 min]
[INFO] Apache Hadoop HDFS Native Client ................... SUCCESS [ 15.292 s]
[INFO] Apache Hadoop HttpFS ............................... SUCCESS [01:29 min]
[INFO] Apache Hadoop HDFS-NFS ............................. SUCCESS [ 18.799 s]
[INFO] Apache Hadoop HDFS-RBF ............................. SUCCESS [01:26 min]
[INFO] Apache Hadoop HDFS Project ......................... SUCCESS [  1.059 s]
[INFO] Apache Hadoop YARN ................................. SUCCESS [  0.720 s]
[INFO] Apache Hadoop YARN API ............................. SUCCESS [02:01 min]
[INFO] Apache Hadoop YARN Common .......................... SUCCESS [08:19 min]
[INFO] Apache Hadoop YARN Server .......................... SUCCESS [  1.130 s]
[INFO] Apache Hadoop YARN Server Common ................... SUCCESS [02:41 min]
[INFO] Apache Hadoop YARN Registry ........................ SUCCESS [ 44.406 s]
[INFO] Apache Hadoop YARN NodeManager ..................... SUCCESS [02:17 min]
[INFO] Apache Hadoop YARN Web Proxy ....................... SUCCESS [01:01 min]
[INFO] Apache Hadoop YARN ApplicationHistoryService ....... SUCCESS [32:12 min]
[INFO] Apache Hadoop YARN Timeline Service ................ SUCCESS [01:16 min]
[INFO] Apache Hadoop YARN ResourceManager ................. SUCCESS [03:02 min]
[INFO] Apache Hadoop YARN Server Tests .................... SUCCESS [01:01 min]
[INFO] Apache Hadoop YARN Client .......................... SUCCESS [01:26 min]
[INFO] Apache Hadoop YARN SharedCacheManager .............. SUCCESS [01:11 min]
[INFO] Apache Hadoop YARN Timeline Plugin Storage ......... SUCCESS [ 43.437 s]
[INFO] Apache Hadoop YARN TimelineService HBase Backend ... SUCCESS [06:09 min]
[INFO] Apache Hadoop YARN Timeline Service HBase tests .... SUCCESS [33:22 min]
[INFO] Apache Hadoop YARN Router .......................... SUCCESS [01:02 min]
[INFO] Apache Hadoop YARN Applications .................... SUCCESS [  1.314 s]
[INFO] Apache Hadoop YARN DistributedShell ................ SUCCESS [01:06 min]
[INFO] Apache Hadoop YARN Unmanaged Am Launcher ........... SUCCESS [ 56.187 s]
[INFO] Apache Hadoop YARN Site ............................ SUCCESS [  2.283 s]
[INFO] Apache Hadoop YARN UI .............................. SUCCESS [  3.786 s]
[INFO] Apache Hadoop YARN Project ......................... SUCCESS [02:04 min]
[INFO] Apache Hadoop MapReduce Client ..................... SUCCESS [  4.511 s]
[INFO] Apache Hadoop MapReduce Core ....................... SUCCESS [03:20 min]
[INFO] Apache Hadoop MapReduce Common ..................... SUCCESS [01:15 min]
[INFO] Apache Hadoop MapReduce Shuffle .................... SUCCESS [01:09 min]
[INFO] Apache Hadoop MapReduce App ........................ SUCCESS [01:59 min]
[INFO] Apache Hadoop MapReduce HistoryServer .............. SUCCESS [02:02 min]
[INFO] Apache Hadoop MapReduce JobClient .................. SUCCESS [02:30 min]
[INFO] Apache Hadoop MapReduce HistoryServer Plugins ...... SUCCESS [ 52.876 s]
[INFO] Apache Hadoop MapReduce NativeTask ................. SUCCESS [02:50 min]
[INFO] Apache Hadoop MapReduce Examples ................... SUCCESS [ 47.315 s]
[INFO] Apache Hadoop MapReduce ............................ SUCCESS [ 13.200 s]
[INFO] Apache Hadoop MapReduce Streaming .................. SUCCESS [01:13 min]
[INFO] Apache Hadoop Distributed Copy ..................... SUCCESS [01:24 min]
[INFO] Apache Hadoop Archives ............................. SUCCESS [ 19.201 s]
[INFO] Apache Hadoop Archive Logs ......................... SUCCESS [ 31.913 s]
[INFO] Apache Hadoop Rumen ................................ SUCCESS [01:41 min]
[INFO] Apache Hadoop Gridmix .............................. SUCCESS [01:48 min]
[INFO] Apache Hadoop Data Join ............................ SUCCESS [01:20 min]
[INFO] Apache Hadoop Extras ............................... SUCCESS [01:19 min]
[INFO] Apache Hadoop Pipes ................................ SUCCESS [ 11.843 s]
[INFO] Apache Hadoop OpenStack support .................... SUCCESS [02:07 min]
[INFO] Apache Hadoop Amazon Web Services support .......... SUCCESS [09:28 min]
[INFO] Apache Hadoop Kafka Library support ................ SUCCESS [02:02 min]
[INFO] Apache Hadoop Azure support ........................ SUCCESS [03:32 min]
[INFO] Apache Hadoop Aliyun OSS support ................... SUCCESS [02:32 min]
[INFO] Apache Hadoop Client Aggregator .................... SUCCESS [02:11 min]
[INFO] Apache Hadoop Mini-Cluster ......................... SUCCESS [03:22 min]
[INFO] Apache Hadoop Scheduler Load Simulator ............. SUCCESS [01:39 min]
[INFO] Apache Hadoop Resource Estimator Service ........... SUCCESS [03:13 min]
[INFO] Apache Hadoop Azure Data Lake support .............. SUCCESS [01:49 min]
[INFO] Apache Hadoop Tools Dist ........................... SUCCESS [05:19 min]
[INFO] Apache Hadoop Tools ................................ SUCCESS [  1.191 s]
[INFO] Apache Hadoop Client API ........................... SUCCESS [04:11 min]
[INFO] Apache Hadoop Client Runtime ....................... SUCCESS [03:26 min]
[INFO] Apache Hadoop Client Packaging Invariants .......... SUCCESS [  6.019 s]
[INFO] Apache Hadoop Client Test Minicluster .............. SUCCESS [06:32 min]
[INFO] Apache Hadoop Client Packaging Invariants for Test . SUCCESS [  3.883 s]
[INFO] Apache Hadoop Client Packaging Integration Tests ... SUCCESS [  2.013 s]
[INFO] Apache Hadoop Distribution ......................... SUCCESS [ 38.955 s]
[INFO] Apache Hadoop Client Modules ....................... SUCCESS [  1.057 s]
[INFO] Apache Hadoop Cloud Storage ........................ SUCCESS [  7.292 s]
[INFO] Apache Hadoop Cloud Storage Project 3.0.3 .......... SUCCESS [  0.584 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 04:28 h
[INFO] Finished at: 2018-07-15T03:08:00+08:00

Output like this means the build succeeded.

The compiled distribution is under hadoop-3.0.3-src/hadoop-dist/target/hadoop-3.0.3; its contents match the binary package downloadable from the official site.
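
To confirm that the native libraries actually made it into the build, the compiled distribution's own checknative command can be used (run from the directory above; requires JAVA_HOME to be set):

cd /usr/local/hadoop-3.0.3-src/hadoop-dist/target/hadoop-3.0.3
bin/hadoop checknative -a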




Possible problems during compilation:

1. Wrong protobuf version; it must be 2.5.

2. CMake fails with "code 1": the ncurses-devel and openssl-devel packages are missing. Install them with:

sudo yum install ncurses-devel openssl-devel

