Error during Hive on Spark configuration

2023-05-23 11:11:25,356 ERROR [d107e52b-9f8e-437b-93f7-724b24826848 main] client.SparkClientImpl (SparkClientImpl.java:<init>(120)) - Timed out waiting for client to connect.
Possible reasons include network issues, errors in remote driver or the cluster has no available resources, etc.
Please check YARN or Spark driver's logs for further information.
java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Timed out waiting for client connection.
    at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:41)
    at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:106)
    at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:88)
    at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.createRemoteClient(RemoteHiveSparkClient.java:105)
    at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:101)
    at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:76)
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:87)
    at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:115)
    at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136)
    at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:115)
    at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:205)
    at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97)
    at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2664)
    at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2335)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2011)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1709)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1703)
    at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157)
    at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:821)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:759)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:318)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:232)
Caused by: java.util.concurrent.TimeoutException: Timed out waiting for client connection.
    at org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:172)
    at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
    at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:120)
    at io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:403)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:463)
    at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
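
As the message itself suggests, the first thing to check is the YARN / Spark driver log, which usually names the real cause (no available resources, classpath or version conflicts, etc.). A minimal sketch, assuming the cluster runs Spark on YARN with log aggregation enabled; <application_id> is a placeholder for the id found by the first command:

# list applications to find the id of the failed Hive on Spark session
yarn application -list -appStates ALL

# fetch its aggregated logs
yarn logs -applicationId <application_id>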

Add the following configuration to hive-site.xml:

<property>
        <name>hive.spark.client.connect.timeout</name>
        <value>10000</value>
</property>
<property>
        <name>hive.spark.client.server.connect.timeout</name>
        <value>90000</value>
</property>
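
Both values are in milliseconds (a bare number for these properties is interpreted as ms), so this raises hive.spark.client.connect.timeout well above its 1000 ms default and allows 90 s for the remote driver to connect back. After restarting Hive, you can verify that the values were picked up from the Hive CLI:

hive> set hive.spark.client.connect.timeout;
hive> set hive.spark.client.server.connect.timeout;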
If the timeouts are not the problem, another possible cause is a version mismatch between Hive and Spark.

Check the lib directory under the Hive installation:

[root@hadoop102 lib]# ls | grep spark
hive-spark-client-3.1.2.jar
spark-core_2.12-3.0.0.jar
spark-kvstore_2.12-3.0.0.jar
spark-launcher_2.12-3.0.0.jar
spark-network-common_2.12-3.0.0.jar
spark-network-shuffle_2.12-3.0.0.jar
spark-tags_2.12-3.0.0.jar
spark-unsafe_2.12-3.0.0.jar

Both the pure (without-Hadoop) Spark distribution and the jars with Hadoop dependencies must be version 3.0.0 here, matching the spark-*_2.12-3.0.0.jar files listed above.
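
To confirm the installed Spark actually matches these jars, you can print its version (a sketch; the /opt/module/spark path is an assumption based on the hadoop102 layout above, adjust to your installation):

# version of the Spark that will launch the remote driver
spark-submit --version

# or read it from the RELEASE file shipped with the Spark distribution
cat /opt/module/spark/RELEASE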
