Hive gateway (client) configuration

This post documents the failures hit while configuring a Hive gateway: the TTransportException raised when the client could not connect to the metastore because GSS initiation failed, and the Kerberos delegation-token error that appeared after adding the HDFS gateway and was resolved by also adding the YARN gateway. Together these cases cover the common pitfalls, and their fixes, of setting up a Hive client on a Kerberos-secured cluster.


Configuring the Hive gateway host
Caused by: MetaException(message:Could not connect to meta store using any of the URIs provided. Most recent failure: org.apache.thrift.transport.TTransportException: GSS initiate failed
    at org.apache.thrift.transport.TSaslTransport.sendAndThrowMessage(TSaslTransport.java:221)
    at org.apache.thrift.transport.TSaslTransport.open(TSaslTransport.java:297)
    at org.apache.thrift.transport.TSaslClientTransport.open(TSaslClientTransport.java:37)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:52)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport$1.run(TUGIAssumingTransport.java:49)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.hive.thrift.client.TUGIAssumingTransport.open(TUGIAssumingTransport.java:49)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:296)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:169)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1161)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:62)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:72)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2407)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2418)
    at org.apache.hadoop.hive.ql.metadata.Hive.getDatabase(Hive.java:1141)
    at org.apache.hadoop.hive.ql.metadata.Hive.databaseExists(Hive.java:1130)
    at org.apache.hadoop.hive.ql.exec.DDLTask.showTables(DDLTask.java:2250)
    at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:334)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:151)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:65)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1485)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1263)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1091)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:931)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:921)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:268)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:422)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:790)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:623)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.open(HiveMetaStoreClient.java:342)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:169)
    ... 31 more
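The root cause of "GSS initiate failed" here is Kerberos, not Thrift: the SASL/GSSAPI handshake with the metastore has no valid credentials, either because no kinit was done (or the ticket has expired) or because the client's hive-site.xml is missing the metastore SASL and principal settings that a properly deployed gateway configuration would provide. The Java sketch below shows, under those assumptions, what a metastore client needs on a Kerberized cluster; the metastore URI, principal names, and keytab path are placeholders, not values taken from this cluster.

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;
import org.apache.hadoop.security.UserGroupInformation;

public class MetastoreKerberosCheck {
    public static void main(String[] args) throws Exception {
        // HiveConf picks up hive-site.xml / core-site.xml from the classpath; on a
        // properly deployed gateway these already carry the values set below.
        HiveConf conf = new HiveConf();
        // Placeholder values -- substitute your metastore host and Kerberos realm.
        conf.set("hive.metastore.uris", "thrift://metastore-host.example.com:9083");
        conf.set("hive.metastore.sasl.enabled", "true");
        conf.set("hive.metastore.kerberos.principal", "hive/_HOST@EXAMPLE.COM");

        // A valid Kerberos identity is exactly what the GSSAPI handshake lacks when
        // "GSS initiate failed" is thrown: either kinit before starting the client,
        // or log in from a keytab as sketched here (principal and path are examples).
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab("etl_user@EXAMPLE.COM",
                "/etc/security/keytabs/etl_user.keytab");

        HiveMetaStoreClient client = new HiveMetaStoreClient(conf);
        System.out.println(client.getAllDatabases());   // simple smoke test
        client.close();
    }
}

With the Hive gateway role deployed, these settings arrive via /etc/hive/conf/hive-site.xml, so in practice a plain kinit before launching the hive CLI is usually all that is missing.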

After adding the HDFS gateway, a new error appears as soon as a query tries to read data:
Failed with exception java.io.IOException:java.io.IOException: Can't get Master Kerberos principal for use as renewer
14/11/18 12:39:41 ERROR CliDriver: Failed with exception java.io.IOException:java.io.IOException: Can't get Master Kerberos principal for use as renewer
java.io.IOException: java.io.IOException: Can't get Master Kerberos principal for use as renewer
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:557)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:495)
    at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:139)
    at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1578)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:280)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:220)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:422)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:790)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:684)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:623)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.io.IOException: Can't get Master Kerberos principal for use as renewer
    at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:116)
    at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:100)
    at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodes(TokenCache.java:80)
    at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:202)
    at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:270)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:386)
    at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:521)

Resolved after adding the YARN gateway.
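The trace above points at TokenCache: to read the table's HDFS files, Hive requests an HDFS delegation token and must name the ResourceManager principal (yarn.resourcemanager.principal) as the token renewer. That property is delivered in yarn-site.xml, which only lands on the client once the YARN gateway role is deployed, which is why adding the YARN gateway makes the error go away. Below is a minimal Java sketch of the same check; the principal and path are hypothetical placeholders, and the token call only does real work on a Kerberized gateway with a valid ticket.

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapreduce.security.TokenCache;

public class RenewerPrincipalCheck {
    public static void main(String[] args) throws Exception {
        // JobConf loads core-site.xml / yarn-site.xml from the classpath. Without a
        // YARN gateway there is no yarn-site.xml, the property below comes back null,
        // and TokenCache fails with "Can't get Master Kerberos principal for use as renewer".
        JobConf conf = new JobConf();
        System.out.println("yarn.resourcemanager.principal = "
                + conf.get("yarn.resourcemanager.principal"));

        // Deploying the YARN gateway supplies the value; setting it by hand
        // (placeholder principal below) is the manual equivalent.
        if (conf.get("yarn.resourcemanager.principal") == null) {
            conf.set("yarn.resourcemanager.principal", "yarn/_HOST@EXAMPLE.COM");
        }

        // Same call path as the stack trace: obtain HDFS delegation tokens for an
        // input path (placeholder), naming the RM principal as the renewer.
        TokenCache.obtainTokensForNamenodes(conf.getCredentials(),
                new Path[] { new Path("/user/hive/warehouse") }, conf);
        System.out.println("tokens: " + conf.getCredentials().getAllTokens());
    }
}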