hive Permission denied: user=anonymous, access=WRITE

This article works through several common problems in Hive and Spark configuration: the missing spark-assembly.jar, a URISyntaxException when starting Hive, "show tables" failing in the Hive CLI, and a permission error when connecting to Hive over JDBC. For each, detailed fixes are given, such as editing the jar path in the hive launch script and adjusting hive-site.xml.


Reposted from: http://blog.sina.com.cn/s/blog_c0dd8b4e0102ycsh.html


Problem 1:


Running hive.sh or hiveserver2.sh fails with: ls: cannot access /home/asus/spark/lib/spark-assembly-*.jar: No such file or directory

Cause:
The hive launch script builds its classpath from ${SPARK_HOME}/lib/spark-assembly-*.jar (the original post showed the script fragment as a screenshot).

The local Spark version is 2.0, and since Spark 2.0.0 the single large assembly JAR under lib/ has been replaced by many smaller JARs under jars/, so the spark-assembly JAR no longer exists. That is the problem.


Solution:
Replace spark-assembly-*.jar with jars/*.jar in the script, and the error goes away.
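The change above can be sketched as follows. This simulates a Spark 2.x directory layout in a temporary directory to show why the glob must change; the jar name and paths are illustrative, and the exact script fragment varies by Hive version, so check your own copy of the launch script:

```shell
# Simulate a Spark 2.x layout: no lib/spark-assembly-*.jar, only jars/*.jar
# (temp dir and jar name are illustrative, not from the article).
SPARK_HOME=$(mktemp -d)
mkdir "$SPARK_HOME/jars"
touch "$SPARK_HOME/jars/spark-core_2.11-2.0.0.jar"

# Old glob used by the hive launch script -- fails on Spark >= 2.0.0:
#   sparkAssemblyPath=`ls ${SPARK_HOME}/lib/spark-assembly-*.jar`

# Replacement glob -- picks up the split jars instead:
sparkAssemblyPath=$(ls "$SPARK_HOME"/jars/*.jar)
CLASSPATH="${CLASSPATH}:${sparkAssemblyPath}"
echo "$CLASSPATH"
```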

 

Problem 2:

Starting Hive fails with: Caused by: java.net.URISyntaxException: Relative path in absolute URI: ${system:java.io.tmpdir}/${system:user.name}

Solution:
1. Inspect hive-site.xml; several properties have values containing "${system:java.io.tmpdir}".
2. Create a new directory iotmp under the Hive installation directory, e.g. /home/asus/hive/iotmp.
3. Change every property value containing "${system:java.io.tmpdir}" to that directory, /home/asus/hive/iotmp.
   Start Hive again: success!
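For reference, an affected property typically looks like this before and after the change. hive.exec.local.scratchdir is one common example (an assumption; your hive-site.xml may contain several such entries), and the value is the iotmp directory created in step 2:

```xml
<!-- Before (Hive cannot resolve ${system:java.io.tmpdir}):
<property>
  <name>hive.exec.local.scratchdir</name>
  <value>${system:java.io.tmpdir}/${system:user.name}</value>
</property>
-->
<!-- After: point it at the directory created in step 2 -->
<property>
  <name>hive.exec.local.scratchdir</name>
  <value>/home/asus/hive/iotmp</value>
</property>
```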

Problem 3:

Running the following command in the Hive CLI fails:
hive> show tables;
with the error:
Failed with exception java.io.IOException:java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: ${system:user.name}

Solution:
Open hive-site.xml, change every ${system:user.name} to ${user.name}, then restart Hive.
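As a sketch of what that substitution looks like, here is one property where ${system:user.name} commonly appears (the property name is an example, not taken from the article; apply the same change to every value that contains it):

```xml
<!-- Before (fails):
<property>
  <name>hive.server2.logging.operation.log.location</name>
  <value>${system:java.io.tmpdir}/${system:user.name}/operation_logs</value>
</property>
-->
<!-- After: -->
<property>
  <name>hive.server2.logging.operation.log.location</name>
  <value>${system:java.io.tmpdir}/${user.name}/operation_logs</value>
</property>
```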
 

Problem 4:

Querying data over a JDBC connection to Hive fails with:

Exception in thread "main" java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=anonymous, access=WRITE, inode="/user/hive/warehouse/hivetest":asus:supergroup:drwxr-xr-x

 

Cause: the /user/hive/warehouse directory in HDFS is not readable/writable by the connecting (anonymous) user.

Fix: run the command: hdfs dfs -chmod -R 777 /user
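In sequence, the diagnosis and fix look like this (requires a running HDFS). Note that 777 is a blunt instrument suitable for a test setup; on a shared cluster you would rather open only the warehouse tree, or connect as a named user so the request is not anonymous in the first place:

```shell
# Check who owns the warehouse directory and what its mode is:
hdfs dfs -ls /user/hive/warehouse

# The article's fix: make the whole /user tree world-writable (test clusters only):
hdfs dfs -chmod -R 777 /user

# Narrower alternative (assumption: only the warehouse tree needs opening):
#   hdfs dfs -chmod -R 777 /user/hive/warehouse

# Or avoid the anonymous user entirely by passing a user name to beeline:
#   beeline -u "jdbc:hive2://localhost:10000" -n asus
```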
