hive-site.xml for hive-0.12.0

This article walks through installing and configuring Apache Hive on a machine that already has Hadoop installed, covering the key steps of installing MySQL as the metastore, configuring the hive-site.xml file, and installing the MySQL JDBC Connector, and closes with common errors and their fixes.

Original post: http://blog.yidooo.net/archives/apache-hive-installation.html

Prerequisites

Before installing Hive, make sure Hadoop is already installed.

Installing and Configuring Apache Hive

Installing MySQL

This article uses MySQL as Hive's metastore.

sudo yum install mysql-server
  • Create the database
mysql> create database hive;
Query OK, 1 row affected (0.00 sec)
  • Grant database privileges
mysql> grant all on hive.* to 'hive'@'%' identified by 'hive';
Query OK, 0 rows affected (0.00 sec)

mysql> flush privileges;
Query OK, 0 rows affected (0.00 sec)
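To confirm the new account works, you can log back in with it and list its grants (a quick sanity check, not part of the original steps; run `mysql -u hive -p` from the shell and enter the password `hive`):

```sql
-- Run inside a session opened as the new hive user.
show grants for 'hive'@'%';
```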

Installing Hive

tar zxvf hive-0.12.0.tar.gz

Configuring Hive

cd conf
cp hive-default.xml.template hive-site.xml
cp hive-env.sh.template hive-env.sh
cp hive-log4j.properties.template hive-log4j.properties
cp hive-exec-log4j.properties.template hive-exec-log4j.properties
  • hive-site.xml
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://localhost:3306/hive</value>
  <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
  <description>Driver class name for a JDBC metastore</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
  <description>username to use against metastore database</description>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hive</value>
  <description>password to use against metastore database</description>
</property>
<property>
  <name>hive.metastore.schema.verification</name>
  <value>false</value>
  <description>
  Enforce metastore schema version consistency.
  True: Verify that version information stored in metastore matches with one from Hive jars. Also disable automatic
        schema migration attempt. Users are required to manually migrate schema after Hive upgrade which ensures
        proper metastore schema migration. (Default)
  False: Warn if the version information stored in metastore doesn't match with one from Hive jars.
  </description>
</property>
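Taken together, the settings above amount to a hive-site.xml fragment like the following (a sketch; the host, port, and credentials should match the MySQL database and account created earlier, so adjust them to your environment):

```xml
<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://localhost:3306/hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.jdbc.Driver</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>hive</value>
  </property>
  <property>
    <name>hive.metastore.schema.verification</name>
    <value>false</value>
  </property>
</configuration>
```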
  • hive-env.sh
# Set HADOOP_HOME to point to a specific hadoop install directory
HADOOP_HOME=/home/hadoop/hadoop-2.2.0

# Hive Configuration Directory can be controlled by:
export HIVE_CONF_DIR=/home/hadoop/hive-0.12.0/conf

Installing the MySQL JDBC Connector

Download page: http://www.mysql.com/downloads/connector/j/5.1.html

cp mysql-connector-java-5.1.26-bin.jar hive/lib/
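A quick way to confirm the driver jar actually landed in Hive's lib directory is a small check like the one below (a sketch; the `check_jdbc_driver` helper, the `HIVE_HOME` variable, and the jar version are assumptions about your layout):

```shell
#!/bin/sh
# Check a Hive lib directory for a MySQL Connector/J jar.
# Prints "driver found" if any mysql-connector-java-*.jar exists there,
# otherwise prints "driver missing".
check_jdbc_driver() {
  libdir="$1"
  if ls "$libdir"/mysql-connector-java-*.jar >/dev/null 2>&1; then
    echo "driver found"
  else
    echo "driver missing"
  fi
}

# Example usage:
# check_jdbc_driver "$HIVE_HOME/lib"
```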

Testing

hive> create table test (key string);
OK
Time taken: 1.09 seconds
hive> show tables;
OK
test
Time taken: 0.084 seconds, Fetched: 1 row(s)

Common Errors

Error: ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/lib/mysql/mysql.sock' (2)
Fix: sudo service mysqld start
Error: ERROR 1044 (42000): Access denied for user ''@'localhost' to database 'hive'
Fix: log in as a user with sufficient privileges:
[hadoop@zhenlong-master ~]$ mysql -h localhost -u root -p
Enter password:
Error: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
This error has many possible causes, so some debugging is needed. Start Hive with debug logging enabled, ./hive -hiveconf hive.root.logger=DEBUG,console, and the debug output will show the details of the error.
If the error message is:
Caused by: org.datanucleus.exceptions.NucleusException: Attempt to invoke the "BoneCP" plugin to create a ConnectionPool gave an error : The specified datastore driver ("com.mysql.jdbc.Driver") was not found in the CLASSPATH. Please check your CLASSPATH specification, and the name of the driver.
Fix: copy the MySQL JDBC driver into hive/lib.
If the error message is:
Caused by: MetaException(message:Version information not found in metastore. )
Fix: set hive.metastore.schema.verification to false:
<property>
  <name>hive.metastore.schema.verification</name>
  <value>false</value>
  <description>
  Enforce metastore schema version consistency.
  True: Verify that version information stored in metastore matches with one from Hive jars. Also disable automatic
        schema migration attempt. Users are required to manually migrate schema after Hive upgrade which ensures
        proper metastore schema migration. (Default)
  False: Warn if the version information stored in metastore doesn't match with one from Hive jars.
  </description>
</property>
