The .bashrc-like file in a Hive environment

This article looks at how to automate commands that should run before every Hive SQL query by adding them to a dedicated configuration file, improving data-processing efficiency. It focuses on how the .hiverc file is used, explains how its expected location has changed across versions, and gives a concrete example.
Hive has a .bashrc-like file called .hiverc. If you need certain commands to run before every HQL execution, you can put them in $HIVE_CONF_DIR/.hiverc. Before Hive 0.10, this file was located in $HIVE_HOME/bin/ instead.
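For example, a minimal .hiverc could enable a couple of convenient CLI settings and register a UDF jar so they take effect before every session. The specific properties and the jar path below are illustrative assumptions, not required values:

-- $HIVE_CONF_DIR/.hiverc: each statement runs before the CLI session starts
set hive.cli.print.current.db=true;    -- show the current database in the prompt
set hive.cli.print.header=true;        -- print column headers with query results
add jar /path/to/custom-udfs.jar;      -- hypothetical path to a shared UDF jar

After saving the file, simply starting the Hive CLI should apply these settings automatically; if the prompt changes from hive> to hive (default)>, the .hiverc file was picked up.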