Resolving java.lang.BootstrapMethodError: call site initialization exception

ES data-collection errors after a Flink upgrade, and how they were fixed
After our company upgraded its Flink version, a component-compatibility problem caused ES data collection to fail with `java.lang.BootstrapMethodError` and `LambdaConversionException`. Importing httpcore and httpclient separately in the pom file and excluding the conflicting transitive dependencies from the ES artifact did not resolve it. Downgrading the ES dependency from 7.11.1 to 7.2.0 finally fixed the error.

Root cause:

Our company upgraded its Flink version. We package our jobs as jar files and submit them to the Flink cluster, so the upgrade changed which component versions those jobs see at runtime.

Problem:

When collecting data from ES, the job failed with the following errors:
java.lang.BootstrapMethodError: call site initialization exception
java.lang.invoke.LambdaConversionException: Invalid receiver type interface org.apache.http.Header; not a subtype of implementation type interface org.apache.http.NameValuePair
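This pair of errors means an invokedynamic call site failed to bootstrap: somewhere a method-reference lambda was compiled against an httpcore version in which `Header` is a subtype of `NameValuePair`, but the Flink classpath supplied an older httpcore in which it is not. A minimal stdlib-only sketch (all names here are illustrative, not from the ES client) that provokes the same `LambdaConversionException` by handing `LambdaMetafactory` a receiver type that is not a subtype of the implementation method's receiver:

```java
import java.lang.invoke.LambdaConversionException;
import java.lang.invoke.LambdaMetafactory;
import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.MethodType;

public class LambdaMismatchDemo {
    interface Greeter {
        String greet();
    }

    // Returns true if the metafactory rejects the mismatched receiver type —
    // the same validation that fails in the ES/httpcore conflict.
    static boolean triggersMismatch() throws Throwable {
        MethodHandles.Lookup lookup = MethodHandles.lookup();
        // Implementation method: String.toUpperCase() — its receiver type is String.
        MethodHandle impl = lookup.findVirtual(
                String.class, "toUpperCase", MethodType.methodType(String.class));
        try {
            // Claim the captured receiver is an Integer; Integer is not a
            // subtype of String, so bootstrapping the call site must fail.
            LambdaMetafactory.metafactory(
                    lookup,
                    "greet",
                    MethodType.methodType(Greeter.class, Integer.class), // captured receiver type
                    MethodType.methodType(String.class),                 // SAM signature
                    impl,
                    MethodType.methodType(String.class));
            return false;
        } catch (LambdaConversionException e) {
            // Message reads like: "Invalid receiver type class java.lang.Integer;
            // not a subtype of implementation type class java.lang.String"
            return true;
        }
    }

    public static void main(String[] args) throws Throwable {
        System.out.println("mismatch detected: " + triggersMismatch());
    }
}
```

In the real failure the compiler, not our code, emitted the mismatched call site; the mismatch only appears at runtime because a different httpcore jar wins on the classpath.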

Bug fix:

In the pom file I first imported httpcore and httpclient explicitly and excluded the conflicting transitive dependencies from the ES artifact, but that did not solve my problem. What finally worked was downgrading the ES dependency from 7.11.1 to 7.2.0.
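The two attempts can be sketched as pom fragments. The artifactId `elasticsearch-rest-high-level-client` and the httpcore/httpclient version numbers shown are assumptions, substitute whichever ES client artifact and versions the job actually uses:

```xml
<!-- Attempt 1 (did not work here): pin httpcore/httpclient explicitly
     and exclude the copies pulled in by the ES client. -->
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpcore</artifactId>
    <version>4.4.13</version>
</dependency>
<dependency>
    <groupId>org.apache.httpcomponents</groupId>
    <artifactId>httpclient</artifactId>
    <version>4.5.13</version>
</dependency>
<dependency>
    <groupId>org.elasticsearch.client</groupId>
    <artifactId>elasticsearch-rest-high-level-client</artifactId>
    <version>7.11.1</version>
    <exclusions>
        <exclusion>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpcore</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.apache.httpcomponents</groupId>
            <artifactId>httpclient</artifactId>
        </exclusion>
    </exclusions>
</dependency>

<!-- Attempt 2 (worked): drop the ES client back to 7.2.0, whose httpcore
     requirement matches what the Flink classpath already provides. -->
<dependency>
    <groupId>org.elasticsearch.client</groupId>
    <artifactId>elasticsearch-rest-high-level-client</artifactId>
    <version>7.2.0</version>
</dependency>
```

Another option often used for Flink jobs, if downgrading is not acceptable, is relocating (shading) the `org.apache.http` packages inside the job jar with the maven-shade-plugin so the cluster's copy can no longer shadow the one the ES client was compiled against.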
