1. First, all Hadoop services must be up and running. For setting up Hadoop, see the link below:
必成功的Hadoop环境搭建jdk环境搭建-超详细操作_吃土的程序员的博客-优快云博客_hadoop搭建jdk (a step-by-step Hadoop/JDK environment setup guide on CSDN)
2. Second, Hive itself must already be set up and working, because connecting to Hive requires that Hive has no problems of its own.
3. Next is the walkthrough in IDEA
3.1 Open IDEA
3.2 Create a new project
3.3 Create a new Maven project and click Next
3.4 Set your own project name and path, then click Finish
3.5 Open the pom file and add the dependencies
3.6 Replace the highlighted part, and change the Scala version (shown in green) to the version you are actually using
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>org.example</groupId>
<artifactId>hive</artifactId>
<version>1.0-SNAPSHOT</version>
<properties>
<maven.compiler.source>8</maven.compiler.source>
<maven.compiler.target>8</maven.compiler.target>
<scala.version>2.11.0</scala.version>
<spark.version>2.2.0</spark.version>
<slf4j.version>1.7.16</slf4j.version>
<log4j.version>1.2.17</log4j.version>
</properties>
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>${scala.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.11</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.11</artifactId>
<version>${spark.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>jcl-over-slf4j</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-api</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>org.slf4j</groupId>
<artifactId>slf4j-log4j12</artifactId>
<version>${slf4j.version}</version>
</dependency>
<dependency>
<groupId>log4j</groupId>
<artifactId>log4j</artifactId>
<version>${log4j.version}</version>
</dependency>
<!-- MySQL Client 依赖 -->
<dependency>
<groupId>mysql</groupId>
<artifactId>mysql-connector-java</artifactId>
<version>5.1.38</version>
</dependency>
</dependencies>
</project>
3.7 Click refresh and let Maven start downloading the dependencies (this takes roughly half an hour)
3.8 Open Project Structure
3.9 Click the + sign to add a Scala SDK
4.0 Download a Scala version
4.1 Select the Scala version and click OK (at this point the quick sanity check sketched below can confirm the setup)
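Before wiring in Hive, a quick way to confirm that the Maven dependencies and the Scala SDK work together is to run a tiny local Spark job with no Hive involved. This is only a sketch; the object name SmokeTest is my own choice, and any small DataFrame would do:

import org.apache.spark.sql.SparkSession

object SmokeTest {
  def main(args: Array[String]): Unit = {
    // Plain local SparkSession, no Hive support yet
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("smoke-test")
      .getOrCreate()

    import spark.implicits._

    // If this prints a two-row table, the Spark/Scala setup is fine
    Seq((1, "hadoop"), (2, "hive")).toDF("id", "name").show()

    spark.stop()
  }
}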
4.2 Add the Hive configuration file so the project can use it (in a Maven project it normally goes under src/main/resources)
4.3 The parts marked in red must be changed to your own IP and port; you can find your own port in Hadoop's core-site.xml configuration file inside the virtual machine
<?xml version="1.0" encoding="UTF-8" standalone="no"?> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?> <configuration> <property> <name>hive.exec.scratchdir</name> <value>hdfs://192.168.11.100:8020/user/hive/tmp</value> </property> <property> <name>hive.metastore.warehouse.dir</name> <value>hdfs://192.168.11.100:8020/user/hive/warehouse</value> </property> <property> <name>hive.querylog.location</name> <value>hdfs://192.168.11.100:8020/user/hive/log</value> </property> <property> <name>hive.metastore.uris</name> <value>thrift://192.168.11.100:9083</value> </property> <property> <name>metastore.catalog.default</name> <value>hive</value> </property> <property> <name>javax.jdo.option.ConnectionURL</name> <value>jdbc:mysql://192.168.11.100:3306?createDatabaseIfNotExist=true&characterEncoding=UTF-8&useSSL=false</value> </property> <property> <name>javax.jdo.option.ConnectionDriverName</name> <value>com.mysql.jdbc.Driver</value> </property> <property> <name>javax.jdo.option.ConnectionUserName</name> <value>hive</value> </property> <property> <name>javax.jdo.option.ConnectionPassword</name> <value>hive</value> </property> </configuration>
4.4 Right-click the blue project package and create a new Scala class; choose Object (the code can be run from an object)
4.5 Once the dependencies in the pom file have finished downloading, you can connect to Hive from code (start Hive first, then run the code)
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession
import java.util.Properties

object hive1 {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setMaster("local").setAppName("hive")
    // Create the SparkSession object with Hive support enabled
    val spark = SparkSession.builder().config(conf).enableHiveSupport().getOrCreate()
    // List the Hive databases to verify the connection
    spark.sql("show databases").show()
  }
}
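The pom above already pulls in mysql-connector-java, and the code imports java.util.Properties, so a natural follow-up is to push a query result from Hive out to MySQL over JDBC. The lines below are only a sketch meant to go inside the same main method, after the show() call; the target database test, the table name hive_dbs, and the user/password are assumptions, not values from this post.

    // Sketch: write the list of Hive databases into a MySQL table over JDBC.
    // All connection details below are placeholders; substitute your own.
    val props = new Properties()
    props.setProperty("user", "hive")
    props.setProperty("password", "hive")
    props.setProperty("driver", "com.mysql.jdbc.Driver")

    val dbs = spark.sql("show databases")   // reuses the SparkSession created above
    dbs.write
      .mode("overwrite")
      .jdbc("jdbc:mysql://192.168.11.100:3306/test?useSSL=false", "hive_dbs", props)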
4.6 Start the virtual machine and bring up the Hadoop services
4.7 Start the Hive metastore listener (e.g. with hive --service metastore &)
4.8 Open a new window and enter the Hive CLI
4.9 Open another new window and create a /user directory on HDFS under / with full permissions (e.g. hdfs dfs -mkdir /user followed by hdfs dfs -chmod -R 777 /user)
5.0 Successfully connected to the Hive database
---------------------------------------------------------------------------------------------------------
That's the whole process of connecting to Hive, and it doubles as a small summary for myself. I hope the write-up helps!!!