Compiling the hadoop-2.6.0-cdh5.10.0 source on CentOS

This post documents the process of building Hadoop from source and the pitfalls along the way, including picking the right Java version and installing protoc. It also covers how to add Snappy compression support by passing extra flags at build time.


I found a blog post online and followed the steps in it:

http://blog.youkuaiyun.com/linlinv3/article/details/49358217
Problems encountered:
1. Java version: it must be JDK 7, not JDK 8; with JDK 8 the build fails with an error complaining about the Java version.
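A quick way to confirm which JDK the build will use, and to switch to a JDK 7 (the install path below is an example, not from the original post):

java -version                           # must report 1.7.x, not 1.8.x
export JAVA_HOME=/usr/java/jdk1.7.0_79  # example JDK 7 location; adjust to your machine
export PATH=$JAVA_HOME/bin:$PATH
mvn -version                            # confirm Maven also picks up the JDK 7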
2. Installing protoc: ./configure --prefix=/usr/local/protoc/ ;
make; make install ;
https://my.oschina.net/ifraincoat/blog/502842
protoc --version should print the version info; only then is the install good. If it prints nothing, the Hadoop build will later fail because the protoc command cannot be found.
Remember to add it to the PATH:
export PATH=/usr/local/protoc/bin:$PATH
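To make that PATH change survive new shells, one option (assumed here, not taken from the original post) is to persist it in /etc/profile:

echo 'export PATH=/usr/local/protoc/bin:$PATH' >> /etc/profile
source /etc/profile
protoc --version    # Hadoop 2.x expects libprotoc 2.5.0; no output means protoc still is not found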
3. protoc had already been built once (make, then make install); running make again afterwards fails with:

libtool: install: error: cannot install `libaprutil-1.la' to a directory

http://lxsym.blog.51cto.com/1364623/739509

Running make clean, re-extracting the source, and building again fixed it.
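A minimal recovery sequence, assuming the library was unpacked from a tarball one directory up (the <library> placeholder is illustrative; substitute your actual source directory and tarball):

make clean                   # drop the stale state from the earlier build
cd ..
rm -rf <library>             # remove the polluted source tree
tar -xzf <library>.tar.gz    # re-extract a clean copy
cd <library>
./configure && make && make install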

I used to think building Hadoop was some lofty undertaking; having now done it myself, there really isn't much to it (maybe I just got lucky). The main variable is whether your network connection is stable, since Maven downloads a lot of dependencies.

[INFO] Executed tasks

[INFO]
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-dist ---
[INFO] Building jar: /home/hadoop-2.6.0-cdh5.10.0/hadoop-dist/target/hadoop-dist-2.6.0-cdh5.10.0-javadoc.jar
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Hadoop Main …………………………… SUCCESS [ 3.057 s]
[INFO] Apache Hadoop Build Tools …………………….. SUCCESS [ 0.946 s]
[INFO] Apache Hadoop Project POM …………………….. SUCCESS [ 1.372 s]
[INFO] Apache Hadoop Annotations …………………….. SUCCESS [ 2.150 s]
[INFO] Apache Hadoop Assemblies ……………………… SUCCESS [ 0.377 s]
[INFO] Apache Hadoop Project Dist POM ………………… SUCCESS [ 1.741 s]
[INFO] Apache Hadoop Maven Plugins …………………… SUCCESS [ 3.162 s]
[INFO] Apache Hadoop MiniKDC ………………………… SUCCESS [ 4.451 s]
[INFO] Apache Hadoop Auth …………………………… SUCCESS [ 4.575 s]
[INFO] Apache Hadoop Auth Examples …………………… SUCCESS [ 3.262 s]
[INFO] Apache Hadoop Common …………………………. SUCCESS [05:44 min]
[INFO] Apache Hadoop NFS ……………………………. SUCCESS [01:36 min]
[INFO] Apache Hadoop KMS ……………………………. SUCCESS [01:12 min]
[INFO] Apache Hadoop Common Project ………………….. SUCCESS [ 0.149 s]
[INFO] Apache Hadoop HDFS …………………………… SUCCESS [14:13 min]
[INFO] Apache Hadoop HttpFS …………………………. SUCCESS [02:11 min]
[INFO] Apache Hadoop HDFS BookKeeper Journal ………….. SUCCESS [05:33 min]
[INFO] Apache Hadoop HDFS-NFS ……………………….. SUCCESS [ 5.474 s]
[INFO] Apache Hadoop HDFS Project ……………………. SUCCESS [ 0.101 s]
[INFO] hadoop-yarn …………………………………. SUCCESS [ 0.061 s]
[INFO] hadoop-yarn-api ……………………………… SUCCESS [01:28 min]
[INFO] hadoop-yarn-common …………………………… SUCCESS [03:52 min]
[INFO] hadoop-yarn-server …………………………… SUCCESS [ 0.128 s]
[INFO] hadoop-yarn-server-common …………………….. SUCCESS [ 11.223 s]
[INFO] hadoop-yarn-server-nodemanager ………………… SUCCESS [04:46 min]
[INFO] hadoop-yarn-server-web-proxy ………………….. SUCCESS [ 3.372 s]
[INFO] hadoop-yarn-server-applicationhistoryservice ……. SUCCESS [ 7.077 s]
[INFO] hadoop-yarn-server-resourcemanager …………….. SUCCESS [ 14.744 s]
[INFO] hadoop-yarn-server-tests ……………………… SUCCESS [ 1.995 s]
[INFO] hadoop-yarn-client …………………………… SUCCESS [ 6.928 s]
[INFO] hadoop-yarn-applications ……………………… SUCCESS [ 0.181 s]
[INFO] hadoop-yarn-applications-distributedshell ………. SUCCESS [ 2.479 s]
[INFO] hadoop-yarn-applications-unmanaged-am-launcher ….. SUCCESS [ 2.218 s]
[INFO] hadoop-yarn-site …………………………….. SUCCESS [ 0.077 s]
[INFO] hadoop-yarn-registry …………………………. SUCCESS [ 5.517 s]
[INFO] hadoop-yarn-project ………………………….. SUCCESS [ 7.139 s]
[INFO] hadoop-mapreduce-client ………………………. SUCCESS [ 0.272 s]
[INFO] hadoop-mapreduce-client-core ………………….. SUCCESS [01:17 min]
[INFO] hadoop-mapreduce-client-common ………………… SUCCESS [ 14.681 s]
[INFO] hadoop-mapreduce-client-shuffle ……………….. SUCCESS [ 3.881 s]
[INFO] hadoop-mapreduce-client-app …………………… SUCCESS [ 9.018 s]
[INFO] hadoop-mapreduce-client-hs ……………………. SUCCESS [ 6.268 s]
[INFO] hadoop-mapreduce-client-jobclient ……………… SUCCESS [01:02 min]
[INFO] hadoop-mapreduce-client-hs-plugins …………….. SUCCESS [ 1.990 s]
[INFO] hadoop-mapreduce-client-nativetask …………….. SUCCESS [01:23 min]
[INFO] Apache Hadoop MapReduce Examples ………………. SUCCESS [ 5.734 s]
[INFO] hadoop-mapreduce …………………………….. SUCCESS [ 5.960 s]
[INFO] Apache Hadoop MapReduce Streaming ……………… SUCCESS [ 36.966 s]
[INFO] Apache Hadoop Distributed Copy ………………… SUCCESS [ 39.764 s]
[INFO] Apache Hadoop Archives ……………………….. SUCCESS [ 2.230 s]
[INFO] Apache Hadoop Archive Logs ……………………. SUCCESS [ 2.200 s]
[INFO] Apache Hadoop Rumen ………………………….. SUCCESS [ 5.225 s]
[INFO] Apache Hadoop Gridmix ………………………… SUCCESS [ 3.731 s]
[INFO] Apache Hadoop Data Join ………………………. SUCCESS [ 2.288 s]
[INFO] Apache Hadoop Ant Tasks ………………………. SUCCESS [ 2.038 s]
[INFO] Apache Hadoop Extras …………………………. SUCCESS [ 2.445 s]
[INFO] Apache Hadoop Pipes ………………………….. SUCCESS [ 12.417 s]
[INFO] Apache Hadoop OpenStack support ……………….. SUCCESS [ 6.917 s]
[INFO] Apache Hadoop Amazon Web Services support ………. SUCCESS [05:08 min]
[INFO] Apache Hadoop Azure support …………………… SUCCESS [ 31.225 s]
[INFO] Apache Hadoop Client …………………………. SUCCESS [ 6.233 s]
[INFO] Apache Hadoop Mini-Cluster ……………………. SUCCESS [ 1.772 s]
[INFO] Apache Hadoop Scheduler Load Simulator …………. SUCCESS [ 6.303 s]
[INFO] Apache Hadoop Tools Dist ……………………… SUCCESS [ 9.784 s]
[INFO] Apache Hadoop Tools ………………………….. SUCCESS [ 0.098 s]
[INFO] Apache Hadoop Distribution ……………………. SUCCESS [01:27 min]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 56:15 min
[INFO] Finished at: 2017-04-11T20:49:29+08:00
[INFO] Final Memory: 170M/553M
[INFO] ------------------------------------------------------------------------
For the record: the build ran from 17:33 to 20:51 and finally succeeded. The jars downloaded into /root/.m2/repository total 133M. That window also covers downloading a JDK, switching JDK versions, and recompiling protoc, so overall it went fairly smoothly.

----- 2018-5-23 -----
Our Hadoop recently needed snappy and bzip2 compression support, but the earlier build did not enable it. The references I found all point out that native support has to be selected when Hadoop is built for snappy and friends to work, so a rebuild was needed.
I rebuilt in the original environment:
https://www.58jb.com/html/113.html

Run:
mvn clean package -DskipTests -Pdist,native -Dtar -Dsnappy.lib=/usr/local/lib -Dbundle.snappy
Parameter notes:

-Pdist,native  : rebuild and include the Hadoop native (shared) libraries;
-DskipTests    : skip the tests
-Dtar          : package the result as a tarball at the end
-Dbundle.snappy : bundle snappy support (the official binary download does not include it)
-Dsnappy.lib=/usr/local/lib  : the path where snappy is installed on the build machine; a build sketch for snappy follows this list
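The -Dsnappy.lib value assumes snappy is already installed under /usr/local/lib. A minimal sketch of building it from source (the version number is illustrative):

tar -xzf snappy-1.1.3.tar.gz
cd snappy-1.1.3
./configure                        # the default prefix installs the library into /usr/local/lib
make && make install
ls /usr/local/lib | grep snappy    # expect libsnappy.so plus its versioned siblings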

The resulting native libraries land under /home/hadoop-2.6.0-cdh5.10.0/hadoop-dist/target/hadoop-2.6.0-cdh5.10.0/lib/native.
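To verify, list that directory; the file set noted below is my expectation for a typical native build, not output from the original post:

ls /home/hadoop-2.6.0-cdh5.10.0/hadoop-dist/target/hadoop-2.6.0-cdh5.10.0/lib/native
# expect libhadoop.so*, libhdfs.so*, and (with -Dbundle.snappy) libsnappy.so*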

After the build, run bin/hadoop checknative to see which compression formats Hadoop supports:

[root@nihao hadoop-2.6.0-cdh5.10.0]# bin/hadoop checknative
2018-05-24 10:09:28,634 INFO  [main] bzip2.Bzip2Factory (Bzip2Factory.java:isNativeBzip2Loaded(70)) - Successfully loaded & initialized native-bzip2 library system-native
2018-05-24 10:09:28,644 INFO  [main] zlib.ZlibFactory (ZlibFactory.java:<clinit>(49)) - Successfully loaded & initialized native-zlib library
Native library checking:
hadoop:  true /home/console/hadoop_common/lib/native/libhadoop.so.1.0.0
zlib:    true /lib64/libz.so.1
snappy:  true /home/console/hadoop_common/lib/native/libsnappy.so.1
lz4:     true revision:10301
bzip2:   true /lib64/libbz2.so.1
openssl: false Cannot load libcrypto.so (libcrypto.so: cannot open shared object file: No such file or directory)!
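The openssl: false line just means the loader cannot find an unversioned libcrypto.so. A common fix on CentOS (my assumption; the original post leaves this as-is) is to install the OpenSSL development package, which provides that symlink:

yum install -y openssl-devel    # provides /usr/lib64/libcrypto.so
bin/hadoop checknative          # openssl should now report true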