hiveserver2 && beeline && java client

This article shows how to connect to HiveServer2 from a Java JDBC client and run SQL queries. It covers the commands for starting HiveServer2 and Beeline, plus a complete Java example program that creates a table, loads data, and runs queries.


  1. hiveserver2
    -》start
    bin/hiveserver2                  : run in the foreground
    bin/hiveserver2 &                : run in the background
    bin/hive --service hiveserver2
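
    To confirm that HiveServer2 is actually listening before connecting, a quick port probe helps. Below is a minimal Java sketch; "localhost" and port 10000 (the default hive.server2.thrift.port) are assumptions, so adjust them for your cluster.

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

// Minimal sketch: probe the HiveServer2 Thrift port before attempting a JDBC connection.
// Host "localhost" and port 10000 are assumptions (the default hive.server2.thrift.port).
public class HiveServer2PortCheck {
  public static void main(String[] args) {
    String host = args.length > 0 ? args[0] : "localhost";
    int port = args.length > 1 ? Integer.parseInt(args[1]) : 10000;
    try (Socket socket = new Socket()) {
      socket.connect(new InetSocketAddress(host, port), 3000); // 3-second timeout
      System.out.println("HiveServer2 appears to be listening on " + host + ":" + port);
    } catch (IOException e) {
      System.out.println("Could not reach " + host + ":" + port + ": " + e.getMessage());
    }
  }
}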

  2. beeline (start hiveserver2 first)
    -》start
    bin/beeline
    bin/beeline -u jdbc:hive2://hadoop01.com:10000 -n lm -p 123456
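
    The same connection that Beeline opens with -n/-p can also be made straight from JDBC. Below is a minimal sketch reusing the URL, user lm, and password from the Beeline command above (these are just the example values, not defaults), and assuming the hive-jdbc driver is on the classpath.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

// Minimal sketch: the JDBC equivalent of
//   bin/beeline -u jdbc:hive2://hadoop01.com:10000 -n lm -p 123456
// Host, user and password are taken from the beeline example above; adjust for your cluster.
public class BeelineEquivalent {
  public static void main(String[] args) throws SQLException {
    String url = "jdbc:hive2://hadoop01.com:10000/default";
    try (Connection con = DriverManager.getConnection(url, "lm", "123456");
         Statement stmt = con.createStatement();
         ResultSet res = stmt.executeQuery("show databases")) {
      while (res.next()) {
        System.out.println(res.getString(1));
      }
    }
  }
}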

  3. java client connection
    This works essentially the same as plain JDBC, so you can follow the official documentation. Since the official site is slow to access, I have copied its content below for reference.


import java.sql.SQLException;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import java.sql.DriverManager;
 
public class HiveJdbcClient {
  private static String driverName = "org.apache.hive.jdbc.HiveDriver";
 
  /**
   * @param args
   * @throws SQLException
   */
  public static void main(String[] args) throws SQLException {
    try {
      Class.forName(driverName);
    } catch (ClassNotFoundException e) {
      e.printStackTrace();
      System.exit(1);
    }
    //replace "hive" here with the name of the user the queries should run as
    Connection con = DriverManager.getConnection("jdbc:hive2://localhost:10000/default", "hive", "");
    Statement stmt = con.createStatement();
    String tableName = "testHiveDriverTable";
    stmt.execute("drop table if exists " + tableName);
    stmt.execute("create table " + tableName + " (key int, value string)");
    // show tables
    String sql = "show tables '" + tableName + "'";
    System.out.println("Running: " + sql);
    ResultSet res = stmt.executeQuery(sql);
    if (res.next()) {
      System.out.println(res.getString(1));
    }
    // describe table
    sql = "describe " + tableName;
    System.out.println("Running: " + sql);
    res = stmt.executeQuery(sql);
    while (res.next()) {
      System.out.println(res.getString(1) + "\t" + res.getString(2));
    }
 
    // load data into table
    // NOTE: filepath has to be local to the hive server
    // NOTE: /tmp/a.txt is a ctrl-A separated file with two fields per line
    String filepath = "/tmp/a.txt";
    sql = "load data local inpath '" + filepath + "' into table " + tableName;
    System.out.println("Running: " + sql);
    stmt.execute(sql);
 
    // select * query
    sql = "select * from " + tableName;
    System.out.println("Running: " + sql);
    res = stmt.executeQuery(sql);
    while (res.next()) {
      System.out.println(String.valueOf(res.getInt(1)) + "\t" + res.getString(2));
    }
 
    // regular hive query
    sql = "select count(1) from " + tableName;
    System.out.println("Running: " + sql);
    res = stmt.executeQuery(sql);
    while (res.next()) {
      System.out.println(res.getString(1));
    }
  }
}
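
One note on the example above: it never closes the Connection, Statement, or ResultSet. A minimal sketch of the same "select *" step using try-with-resources (same URL, user, and table name assumed as in the example) would be:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

// Minimal sketch: the "select *" step rewritten with try-with-resources so the
// connection, statement and result set are always closed, even on failure.
// URL, user ("hive") and table name are the same assumptions as in the example above.
public class HiveJdbcSelect {
  public static void main(String[] args) throws SQLException {
    String url = "jdbc:hive2://localhost:10000/default";
    try (Connection con = DriverManager.getConnection(url, "hive", "");
         Statement stmt = con.createStatement();
         ResultSet res = stmt.executeQuery("select * from testHiveDriverTable")) {
      while (res.next()) {
        System.out.println(res.getInt(1) + "\t" + res.getString(2));
      }
    }
  }
}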
Running the JDBC Sample Code
# Then on the command-line
$ javac HiveJdbcClient.java
 
# To run the program using remote hiveserver in non-kerberos mode, we need the following jars in the classpath
# from hive/build/dist/lib
#     hive-jdbc*.jar
#     hive-service*.jar
#     libfb303-0.9.0.jar
#     libthrift-0.9.0.jar
#     log4j-1.2.16.jar
#     slf4j-api-1.6.1.jar
#     slf4j-log4j12-1.6.1.jar
#     commons-logging-1.0.4.jar
#
#
# To run the program using kerberos secure mode, we need the following jars in the classpath
#     hive-exec*.jar
#     commons-configuration-1.6.jar (This is not needed with Hadoop 2.6.x and later).
#  and from hadoop
#     hadoop-core*.jar (use hadoop-common*.jar for Hadoop 2.x)
#
# To run the program in embedded mode, we need the following additional jars in the classpath
# from hive/build/dist/lib
#     hive-exec*.jar
#     hive-metastore*.jar
#     antlr-runtime-3.0.1.jar
#     derby.jar
#     jdo2-api-2.1.jar
#     jpox-core-1.2.2.jar
#     jpox-rdbms-1.2.2.jar
# and from hadoop/build
#     hadoop-core*.jar
# as well as hive/build/dist/conf, any HIVE_AUX_JARS_PATH set, 
# and hadoop jars necessary to run MR jobs (eg lzo codec)
 
$ java -cp $CLASSPATH HiveJdbcClient
Alternatively, you can run the following bash script, which will seed the data file and build your classpath before invoking the client. The script adds all the additional jars needed for using HiveServer2 in embedded mode as well.
#!/bin/bash
HADOOP_HOME=/your/path/to/hadoop
HIVE_HOME=/your/path/to/hive
 
echo -e '1\x01foo' > /tmp/a.txt
echo -e '2\x01bar' >> /tmp/a.txt
 
HADOOP_CORE=$(ls $HADOOP_HOME/hadoop-core*.jar)
CLASSPATH=.:$HIVE_HOME/conf:$(hadoop classpath)
 
for i in ${HIVE_HOME}/lib/*.jar ; do
    CLASSPATH=$CLASSPATH:$i
done
 
java -cp $CLASSPATH HiveJdbcClient