Hadoop Series (11): Setting Up Hive in Pseudo-Distributed Mode (with a Fix for the JDK Version Error)

This article walks through downloading and extracting Hive 1.1.0 (CDH 5.14.2) on a Linux system, configuring the environment variables and hive-site.xml (including the MySQL connection settings and privileges), and resolving the startup error caused by a JDK version conflict. Suitable for beginners.


I. Download address

Download address: http://archive.cloudera.com/cdh5/cdh/5/
You can also install the Apache release instead; download address: http://archive.apache.org/dist/hive/
Required package: hive-1.1.0-cdh5.14.2.tar.gz
1. Upload the package to the designated directory on the Linux system: /opt/software
2. Extract it to the target directory /opt/install (in a learning environment it is fine to work directly as root):

[root@hadoop101 software]$  tar -zxvf hive-1.1.0-cdh5.14.2.tar.gz -C /opt/install
Create a symlink:
[root@hadoop101 software]$ cd /opt/install/
[root@hadoop101 install]$ ln -s /opt/install/hive-1.1.0-cdh5.14.2/ hive
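The symlink just gives you a stable, version-independent path that survives future upgrades. The same pattern, simulated here in a temporary directory so it can be tried anywhere:

```shell
# Simulate the layout: a versioned install directory plus a stable "hive" link
base=$(mktemp -d)
mkdir "$base/hive-1.1.0-cdh5.14.2"
ln -s "$base/hive-1.1.0-cdh5.14.2" "$base/hive"

# The link resolves to the versioned directory
readlink "$base/hive"
```

On an upgrade you would only re-point the link; everything referencing `/opt/install/hive` keeps working.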

3. Configure the environment variables

[root@hadoop101 ~]$  vi /etc/profile
Add the following two lines:
export HIVE_HOME=/opt/install/hive
export PATH=$PATH:$HIVE_HOME/bin
Apply the change:
[root@hadoop101 ~]$ source /etc/profile
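After re-sourcing the profile, a quick sanity check (assuming the paths used in this article) confirms the variables took effect:

```shell
# Assumed install path from this article; adjust if yours differs
export HIVE_HOME=/opt/install/hive
export PATH=$PATH:$HIVE_HOME/bin

# Should print the Hive home, and the bin directory should appear on PATH
echo "$HIVE_HOME"
echo "$PATH" | grep -o '/opt/install/hive/bin'
```

Once Hive is fully installed, `hive --version` should also resolve from any directory.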

4. Edit hive-site.xml

[root@hadoop101 ~]$ cd /opt/install/hive/conf/
[root@hadoop101 ~]$ vi hive-site.xml

Note: replace hadoop101 below with your own hostname or mapped address.

<configuration>
	<property>
		<name>hive.metastore.warehouse.dir</name>
		<value>/home/hadoop/hive/warehouse</value>
	</property>
	<!-- MySQL connection settings -->
	<property>
		<name>javax.jdo.option.ConnectionURL</name>
		<value>jdbc:mysql://hadoop101:3306/hive?createDatabaseIfNotExist=true</value>
	</property>
	<property>
	  <name>javax.jdo.option.ConnectionDriverName</name>
	  <value>com.mysql.jdbc.Driver</value>
	</property>
	<!-- Database user name -->
	<property>
	  <name>javax.jdo.option.ConnectionUserName</name>
	  <value>hive</value>
	</property>
	<!-- Database password -->
	<property>
	  <name>javax.jdo.option.ConnectionPassword</name>
	  <value>hive</value>
	</property>
	<!-- Hive temporary (scratch) file locations -->
	<property>
	  <name>hive.exec.scratchdir</name>
	  <value>/home/hadoop/hive/data/hive-${user.name}</value>
	  <description>Scratch space for Hive jobs</description>
	</property>

	<property>
	  <name>hive.exec.local.scratchdir</name>
	  <value>/home/hadoop/hive/data/${user.name}</value>
	  <description>Local scratch space for Hive jobs</description>
	</property>
</configuration>

Note: In real-world development environments, permissions are usually tightened by creating the directories and granting group write access:

hdfs dfs -mkdir /tmp
hdfs dfs -mkdir -p /home/hadoop/hive/warehouse
hdfs dfs -chmod g+w /tmp
hdfs dfs -chmod g+w /home/hadoop/hive/warehouse

In a test environment you can simply open them up to 777 (or wait until a permission error actually occurs and handle it then):

hdfs dfs -chmod -R 777 /tmp
hdfs dfs -chmod -R 777 /home/hadoop/hive/warehouse
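HDFS permission bits follow the same semantics as a local Linux filesystem, so `g+w` can be demonstrated locally without a running cluster (uses GNU `stat`, standard on Linux):

```shell
# Create a scratch directory with the common 755 mode
d=$(mktemp -d)
chmod 755 "$d"

# g+w adds write permission for the group: 755 -> 775
chmod g+w "$d"
perms=$(stat -c '%a' "$d")   # GNU stat: print the octal mode
echo "$perms"

rm -rf "$d"
```

The `hdfs dfs -chmod` variants above do exactly this, just against HDFS paths.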

5. Edit hive-env.sh

[root@hadoop101 conf]$  mv hive-env.sh.template hive-env.sh
[root@hadoop101 conf]$  vim hive-env.sh

Modify the following entries:

# Set HADOOP_HOME to point to a specific hadoop install directory
HADOOP_HOME=/opt/install/hadoop
# Hive Configuration Directory can be controlled by:
export HIVE_CONF_DIR=/opt/install/hive/conf

# Folder containing extra ibraries required for hive compilation/execution can be controlled by:
export HIVE_AUX_JARS_PATH=/opt/install/hive/lib

6. Edit hive-log4j.properties

[root@hadoop101 conf]$  mkdir -p /opt/install/hive/logs
[root@hadoop101 conf]$ chown -R hadoop:hadoop /opt/install/hive/logs

Rename hive-log4j.properties (drop the .template suffix):

[root@hadoop101 conf]$  mv hive-log4j.properties.template hive-log4j.properties
[root@hadoop101 conf]$  vim hive-log4j.properties

Change the following line in hive-log4j.properties:

hive.log.dir=/opt/install/hive/logs

7. Upload the MySQL JDBC jar into Hive's lib directory and set its permissions

[root@hadoop101 conf]$ cd /opt/install/hive/lib/

Locate the mysql-connector-java-5.1.44-bin.jar driver jar and upload it here.
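The upload itself boils down to a `cp` into Hive's lib directory plus a world-readable mode. A sketch with the paths assumed in this article, simulated below with temporary directories so the commands can be run anywhere:

```shell
# Stand-ins for /opt/software and /opt/install/hive/lib, so the sketch is runnable anywhere
SOFTWARE=$(mktemp -d)
HIVE_LIB=$(mktemp -d)
touch "$SOFTWARE/mysql-connector-java-5.1.44-bin.jar"

# The actual steps: copy the driver jar into Hive's lib and make it readable
cp "$SOFTWARE/mysql-connector-java-5.1.44-bin.jar" "$HIVE_LIB/"
chmod 644 "$HIVE_LIB/mysql-connector-java-5.1.44-bin.jar"
ls "$HIVE_LIB"
```

On the real machine, substitute `/opt/software` and `/opt/install/hive/lib` for the temporary directories.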
8. Note: start MySQL before starting Hive
Log in to MySQL, create a MySQL user for Hive, and grant it the necessary privileges.
Note: replace nodefour with your own hostname or mapped address; the username and password must match the ones configured in hive-site.xml in step 4.
The general form is:

grant all on *.* to '<user>'@'<host>' identified by '<password>';

The actual statements:

create user 'hive'@'%' identified by 'hive';
grant all on *.* to 'hive'@'nodefour' identified by 'hive';
set password for 'hive'@'%' = password('hive');
flush privileges;

9. Start Hive
Simply type hive to launch it:

[root@nodefour jdk-11.0.8]# hive
which: no hbase in (/root/software/jdk-11.0.8/bin:/root/software/jdk-11.0.8/bin:/root/software/jdk-11.0.8/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/opt/install/hadoop/bin:/opt/install/hadoop/sbin:/opt/install/hive/bin:/root/software/apache-maven-3.3.9/bin:/root/bin:/opt/install/hadoop/bin:/opt/install/hadoop/sbin:/opt/install/hive/bin)
20/12/08 17:42:47 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

Logging initialized using configuration in file:/opt/install/hive-1.1.0-cdh5.14.2/conf/hive-log4j.properties
WARNING: Hive CLI is deprecated and migration to Beeline is recommended.
hive> 

The error

The error below is caused by a JDK version that is too high: I had installed JDK 11 at the time, and switching to JDK 8 fixed it. Since Hadoop also depends on the JDK (its environment variables point at the JDK 11 install path, so moving it would directly break Hadoop), my workaround was to delete the original JDK 11 directory, rename the extracted JDK 8 directory to the old JDK 11 name, and place it in the same folder, so every existing path kept working.

[root@nodefour conf]# hive
which: no hbase in (/root/software/jdk-11.0.8/bin:/root/software/jdk-11.0.8/bin:/root/software/jdk-11.0.8/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/opt/install/hadoop/bin:/opt/install/hadoop/sbin:/opt/install/hive/bin:/root/software/apache-maven-3.3.9/bin:/root/bin:/opt/install/hadoop/bin:/opt/install/hadoop/sbin:/opt/install/hive/bin)
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.hadoop.security.authentication.util.KerberosUtil (file:/opt/install/hadoop/share/hadoop/common/lib/hadoop-auth-2.6.0-cdh5.14.2.jar) to method sun.security.krb5.Config.getInstance()
WARNING: Please consider reporting this to the maintainers of org.apache.hadoop.security.authentication.util.KerberosUtil
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
20/12/08 17:16:34 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" java.lang.ClassCastException: class jdk.internal.loader.ClassLoaders$AppClassLoader cannot be cast to class java.net.URLClassLoader (jdk.internal.loader.ClassLoaders$AppClassLoader and java.net.URLClassLoader are in module java.base of loader 'bootstrap')
	at org.apache.hadoop.hive.ql.session.SessionState.<init>(SessionState.java:386)
	at org.apache.hadoop.hive.ql.session.SessionState.<init>(SessionState.java:365)
	at org.apache.hadoop.hive.cli.CliSessionState.<init>(CliSessionState.java:60)
	at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:656)
	at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:634)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.base/java.lang.reflect.Method.invoke(Method.java:566)
	at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
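After swapping the JDK, it is worth confirming which major version the `java` on the PATH actually reports before retrying `hive`. Java 8 prints its version as `1.8.x`, while Java 9+ prints the major number first, so parsing the banner takes a small case split. A helper sketch, shown here against sample banner strings rather than a live `java -version` call:

```shell
# Extract the Java major version from a `java -version` banner line.
# In practice you would feed it: banner=$(java -version 2>&1 | head -n 1)
major_of() {
  v=$(echo "$1" | sed -E 's/.*"([^"]+)".*/\1/')   # text inside the quotes
  case "$v" in
    1.*) echo "$v" | cut -d. -f2 ;;   # old scheme: 1.8.0_231 -> 8
    *)   echo "$v" | cut -d. -f1 ;;   # new scheme: 11.0.8   -> 11
  esac
}

major_of 'java version "1.8.0_231"'               # prints 8
major_of 'java version "11.0.8" 2020-07-14 LTS'   # prints 11
```

If the result is greater than 8, Hive 1.1.0 will hit the `URLClassLoader` cast error above, since its `SessionState` assumes the pre-Java-9 application class loader.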