Configuring Hadoop to Support LZO Compression

Environment Preparation

Prepare an LZO-compressed file
# Check whether the lzop command is available
[hadoop@Gargantua software]$ which lzop
/bin/lzop
# If it is missing, run the following install commands:
[root@Gargantua ~]# yum install -y svn ncurses-devel
[root@Gargantua ~]# yum install -y gcc gcc-c++ make cmake
[root@Gargantua ~]# yum install -y openssl openssl-devel svn ncurses-devel zlib-devel libtool
[root@Gargantua ~]# yum install -y snappy snappy-devel bzip2 bzip2-devel lzo lzo-devel lzop autoconf automake cmake 

LZO compress:   lzop -v filename
LZO decompress: lzop -dv filename

Download, Build, and Install LZO

wget http://www.oberhumer.com/opensource/lzo/download/lzo-2.10.tar.gz

tar -zxvf lzo-2.10.tar.gz

cd lzo-2.10

./configure --prefix=/usr/local/hadoop/lzo/

make

make install

Build the hadoop-lzo Source

2.1 Download the hadoop-lzo source: https://github.com/twitter/hadoop-lzo/archive/master.zip
2.2 After unpacking, edit pom.xml to match your Hadoop version:
<hadoop.current.version>3.2.2</hadoop.current.version>
2.3 Export two temporary environment variables:
export C_INCLUDE_PATH=/usr/local/hadoop/lzo/include
export LIBRARY_PATH=/usr/local/hadoop/lzo/lib
2.4 Build
Enter hadoop-lzo-master and run the Maven build:
mvn package -Dmaven.test.skip=true
2.5 In target/, hadoop-lzo-0.4.21-SNAPSHOT.jar is the successfully built hadoop-lzo component.

Install the jar into Hadoop

Copy the built hadoop-lzo-0.4.21-SNAPSHOT.jar into $HADOOP_HOME/share/hadoop/common/.

Add the following to core-site.xml to enable LZO compression:
    <property>
        <name>io.compression.codecs</name>
        <value>
            org.apache.hadoop.io.compress.GzipCodec,
            org.apache.hadoop.io.compress.DefaultCodec,
            org.apache.hadoop.io.compress.BZip2Codec,
            org.apache.hadoop.io.compress.SnappyCodec,
            com.hadoop.compression.lzo.LzoCodec,
            com.hadoop.compression.lzo.LzopCodec
        </value>
    </property>

    <property>
        <name>io.compression.codec.lzo.class</name>
        <value>com.hadoop.compression.lzo.LzoCodec</value>
    </property>
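
As a quick sanity check after this configuration change, the following minimal sketch (the path is illustrative) asks Hadoop's CompressionCodecFactory which codec it resolves for a .lzo file; with the configuration above it should print com.hadoop.compression.lzo.LzopCodec:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.compress.CompressionCodec;
import org.apache.hadoop.io.compress.CompressionCodecFactory;

public class CodecCheck {
    public static void main(String[] args) {
        // Picks up io.compression.codecs from core-site.xml on the classpath
        Configuration conf = new Configuration();
        CompressionCodecFactory factory = new CompressionCodecFactory(conf);
        // The factory matches codecs by file suffix: .lzo -> LzopCodec
        CompressionCodec codec = factory.getCodec(new Path("/input/wc.data.lzo"));
        System.out.println(codec == null ? "no codec registered for .lzo"
                                         : codec.getClass().getName());
    }
}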

# Compress a file larger than 128 MB (one HDFS block) to wc.data.lzo and upload it to HDFS

	[liqiang@Gargantua data]$ lzop wc.data
	[liqiang@Gargantua data]$ hdfs dfs -put wc.data.lzo /input/wc.data.lzo
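
For jobs that generate data directly, the lzop-and-upload step can also be done programmatically. A minimal sketch, assuming the hadoop-lzo jar and its native libraries are available (file names follow the example above):

import java.io.InputStream;
import java.io.OutputStream;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.util.ReflectionUtils;
import com.hadoop.compression.lzo.LzopCodec;

public class LzoUpload {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // ReflectionUtils injects the Configuration the codec requires
        LzopCodec codec = ReflectionUtils.newInstance(LzopCodec.class, conf);
        FileSystem local = FileSystem.getLocal(conf);
        FileSystem hdfs = FileSystem.get(conf);
        try (InputStream in = local.open(new Path("wc.data"));
             OutputStream out = codec.createOutputStream(
                     hdfs.create(new Path("/input/wc.data.lzo")))) {
            IOUtils.copyBytes(in, out, 4096); // stream copy; output is lzop-framed
        }
    }
}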

Run wordcount

	hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.2.2.jar wordcount -Dmapreduce.output.fileoutputformat.compress=true -Dmapreduce.output.fileoutputformat.compress.codec=com.hadoop.compression.lzo.LzopCodec /input/wc.data.lzo /output/wc2
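
The two -D flags can equally be set in a job driver. A minimal sketch (the class and method names here are illustrative, not a standard API):

import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import com.hadoop.compression.lzo.LzopCodec;

public class LzoOutputConfig {
    // Equivalent of -Dmapreduce.output.fileoutputformat.compress=true
    // and -Dmapreduce.output.fileoutputformat.compress.codec=com.hadoop.compression.lzo.LzopCodec
    public static void enableLzoOutput(Job job) {
        FileOutputFormat.setCompressOutput(job, true);
        FileOutputFormat.setOutputCompressorClass(job, LzopCodec.class);
    }
}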


Index the LZO file

hadoop jar $HADOOP_HOME/share/hadoop/common/hadoop-lzo-0.4.21-SNAPSHOT.jar  com.hadoop.compression.lzo.DistributedLzoIndexer /input/wc.data.lzo

hdfs dfs -rm -r /output/wc2
# Rerunning wordcount after indexing alone still does not help...
The indexer writes a wc.data.lzo.index file next to the data file, but building the index by itself is not enough; the job being run must also be changed accordingly.
Set the input format to LzoTextInputFormat; otherwise the index file is treated as just another input file, and the whole .lzo file is still processed by a single map task.
Add to the command:

-Dmapreduce.job.inputformat.class=com.hadoop.mapreduce.LzoTextInputFormat

Reference: https://blog.csdn.net/qq_43081842/article/details/105455070

hadoop jar $HADOOP_HOME/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.2.2.jar wordcount \
-Dmapreduce.job.inputformat.class=com.hadoop.mapreduce.LzoTextInputFormat   /input/wc.data.lzo /output/wc7

The job now reports number of splits:2.
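
Equivalently, a custom job sets the input format in its driver. A self-contained sketch (class names are illustrative; the mapper and reducer mirror the standard wordcount example):

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import com.hadoop.mapreduce.LzoTextInputFormat;

public class LzoWordCount {
    public static class TokenizerMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();
        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            StringTokenizer it = new StringTokenizer(value.toString());
            while (it.hasMoreTokens()) {
                word.set(it.nextToken());
                ctx.write(word, ONE);
            }
        }
    }

    public static class IntSumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) sum += v.get();
            ctx.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "lzo wordcount");
        job.setJarByClass(LzoWordCount.class);
        // The key line: LzoTextInputFormat skips .index files and uses them
        // to split the .lzo input; TextInputFormat would produce one split.
        job.setInputFormatClass(LzoTextInputFormat.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}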


Building Hadoop Native Code (to support snappy, bzip2, ...)

Protocol Buffers 2.5.0 needs to be downloaded and installed first.

Install dependencies

	[root@Gargantua ~]# yum install -y svn ncurses-devel
	[root@Gargantua ~]# yum install -y gcc gcc-c++ make cmake
	[root@Gargantua ~]# yum install -y openssl openssl-devel svn ncurses-devel zlib-devel libtool
	[root@Gargantua ~]# yum install -y snappy snappy-devel bzip2 bzip2-devel lzo lzo-devel lzop autoconf automake cmake 

# Then run the native build from the Hadoop source tree:
mvn clean package -Pdist -Pnative -Dtar

After the build finishes, overwrite everything in the existing Hadoop installation except the configuration files under etc/.

Other Notes (parked here for now)

MySQL Connector/J URL parameters (force UTF-8):
useUnicode=true&characterEncoding=UTF-8
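
For context: these parameters belong in a JDBC connection string, presumably jotted down for something like a Hive metastore connection. A hypothetical usage (host, database, and credentials are placeholders):

import java.sql.Connection;
import java.sql.DriverManager;

public class JdbcUrlExample {
    public static void main(String[] args) throws Exception {
        // Placeholders: adjust host, database, user, and password
        String url = "jdbc:mysql://localhost:3306/metastore"
                + "?useUnicode=true&characterEncoding=UTF-8";
        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            System.out.println("connected: " + !conn.isClosed());
        }
    }
}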


export PROTOC_HOME=/home/hadoop/app/protobuf
export PATH=$PROTOC_HOME/bin:$PATH

cd $PROTOC_HOME
./configure --prefix=$PROTOC_HOME

make 
make install

Only after make && make install does the protoc command become available under the protobuf directory.

Environment variables for the remaining build tools:

export CMAKE_HOME=/home/hadoop/app/cmake
export PATH=$CMAKE_HOME/bin:$PATH

export DOXYGEN_HOME=/home/hadoop/app/doxygen
export PATH=$DOXYGEN_HOME/bin:$PATH

export FINDBUGS_HOME=/home/hadoop/app/findbugs
export PATH=$FINDBUGS_HOME/bin:$PATH

export NODEJS_HOME=/home/hadoop/app/nodeJs
export PATH=$NODEJS_HOME/bin:$PATH


# Build the full Hadoop distribution with native libraries:
mvn clean package -Pdist,native -DskipTests -Dtar

# The built distribution is produced under:
cd hadoop-dist/target/

