Pig: using hive metastore in oozie

This article describes how to integrate Pig with the Hive metastore in a Hadoop environment: compiling Hive and Pig against Hadoop 2.3.0, updating the Pig sharelib locally and on HDFS, and initializing the Hive metastore schema for use through Hue.

When we run a Pig job that uses a Hive metastore table through Hue, we need to place all of the related jars into the Oozie sharelib.

 

Prepare

a. compile hive-0.12.0 and hive-0.13.0 against hadoop 2.3.0

b. compile pig-0.12.0 against hadoop 2.3.0
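
The exact build invocations vary with each release's source tree; the following is only a rough sketch (the source directory names are assumptions, and the targets, properties, and profiles are the ones commonly documented for these releases, so check each build's README before relying on them):

#cd pig-0.12.0-src && ant clean jar jar-withouthadoop -Dhadoopversion=23
#cd hive-0.12.0-src && ant clean package -Dhadoop.version=2.3.0 -Dhadoop-0.23.version=2.3.0 -Dhadoop.mr.rev=23
#cd hive-0.13.0-src && mvn clean package -DskipTests -Phadoop-2,dist

Whatever the exact commands, the artifacts needed below are the pig-0.12.0.jar / pig-0.12.0-withouthadoop.jar pair and an unpacked hive-0.13.0-bin distribution.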

 

Update Oozie's local Pig sharelib

a. back up all jars in share/lib/pig

b. copy pig-0.12.0.jar and pig-0.12.0-withouthadoop.jar to share/lib/pig

c. copy oozie-sharelib-pig-4.0.1.jar from the backed-up jars back into share/lib/pig

d. copy all jars in hive-0.13.0-bin/hcatalog/share/hcatalog to share/lib/pig

e. copy all jars in hive-0.13.0-bin/lib to share/lib/pig

f. copy the MySQL JDBC driver to share/lib/pig
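
Put together, steps a–f look roughly like the following; the Oozie home, the Pig/Hive install paths, and the MySQL connector jar location are assumptions, and step a is treated as "move the old jars aside", which is what restoring oozie-sharelib-pig-4.0.1.jar from the backup in step c implies:

#cd /usr/local/oozie-4.0.1
#mkdir share/lib/pig.bak
#mv share/lib/pig/*.jar share/lib/pig.bak/
#cp /opt/pig-0.12.0/pig-0.12.0.jar /opt/pig-0.12.0/pig-0.12.0-withouthadoop.jar share/lib/pig/
#cp share/lib/pig.bak/oozie-sharelib-pig-4.0.1.jar share/lib/pig/
#cp /opt/hive-0.13.0-bin/hcatalog/share/hcatalog/*.jar share/lib/pig/
#cp /opt/hive-0.13.0-bin/lib/*.jar share/lib/pig/
#cp /path/to/mysql-connector-java.jar share/lib/pig/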

Update the Pig sharelib on HDFS

a. delete the old share/lib/pig directory on HDFS

b. update the sharelib on HDFS using

#oozie-setup.sh sharelib upgrade -fs hdfs://192.168.122.1:2014 -locallib share/   
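
In full, steps a and b plus a quick listing to confirm the new jars landed on HDFS; the /user/oozie/share/lib path is an assumption, since the actual location depends on the Oozie system libpath setting and the user Oozie runs as:

#hdfs dfs -rm -r /user/oozie/share/lib/pig
#oozie-setup.sh sharelib upgrade -fs hdfs://192.168.122.1:2014 -locallib share/
#hdfs dfs -ls /user/oozie/share/lib/pig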

 

Integrate with Hue

a. initialize the Hive metastore schema to 0.13.0 using

#schematool -dbType mysql -initSchemaTo 0.13.0

b. set the Hue configuration to use hive-0.12.0 (not hive-0.13.0)
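
How step b is configured depends on the Hue version. A minimal sketch, assuming a Hue 3.x style hue.ini and a Hive 0.12.0 install under /opt/hive-0.12.0-bin (both assumptions):

# desktop/conf/hue.ini (path relative to the Hue install)
[beeswax]
  # point Hue at the hive-0.12.0 configuration, not hive-0.13.0
  hive_conf_dir=/opt/hive-0.12.0-bin/conf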

 

 
