org/apache/hadoop/hive/shims/ShimLoader

While running the Sqoop import tool, an exception is thrown: the ShimLoader class cannot be found when Sqoop loads the Hive configuration. The problem usually stems from Hive libraries missing from Sqoop's classpath or from a Hadoop/Hive version mismatch. The fix is to check the Hadoop and Hive dependencies and make sure all required jars are present and mutually compatible.
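A quick way to narrow this down is to confirm that the Hive shims jars exist and that one of them actually contains the missing class. The commands below are a diagnostic sketch; HIVE_HOME and SQOOP_HOME are assumed environment variables pointing at the respective installations, not values taken from the original report.

# List the shims jars that ship with Hive (paths assumed; adjust to your layout)
ls $HIVE_HOME/lib/hive-shims*.jar

# Confirm one of them actually contains the missing class
for j in $HIVE_HOME/lib/hive-shims*.jar; do jar tf "$j" | grep -q 'shims/ShimLoader' && echo "$j"; done

# Check whether Sqoop's own lib directory carries any Hive jars at all
ls $SQOOP_HOME/lib/hive-*.jar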


Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/shims/ShimLoader
        at org.apache.hadoop.hive.conf.HiveConf$ConfVars.<clinit>(HiveConf.java:371)
        at org.apache.hadoop.hive.conf.HiveConf.<clinit>(HiveConf.java:108)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:264)
        at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:44)
        at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:392)
        at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:379)
        at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
        at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
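Because Sqoop instantiates HiveConf inside its own JVM, the Hive jars (hive-shims-*, hive-common, and their dependencies) must be visible on Sqoop's classpath. The following is a minimal sketch of a fix, assuming HIVE_HOME and SQOOP_HOME are set and that copying jars is acceptable in this environment; the install path is an assumption, not part of the original report.

# Option 1: point the Hadoop launcher (which Sqoop uses) at Hive's lib directory
export HIVE_HOME=/opt/bdp/hive-4.0.1                       # assumed install path
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:$HIVE_HOME/lib/*

# Option 2: copy the jars that define the missing classes into Sqoop's own lib
cp $HIVE_HOME/lib/hive-shims*.jar   $SQOOP_HOME/lib/
cp $HIVE_HOME/lib/hive-common-*.jar $SQOOP_HOME/lib/

Either way, Sqoop and Hive should come from compatible release lines; mixing a Hive 4.x lib directory with a Sqoop build compiled against an older Hive can still fail later even after the classpath itself is fixed.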

[root@hadoop201 lib]# ls -l /opt/bdp/hive-4.0.1/lib/orc-*.jar
-rw-r--r-- 1 root root 1157143 Sep 25 2024 /opt/bdp/hive-4.0.1/lib/orc-core-1.8.5.jar
-rw-r--r-- 1 root root   28998 Sep 25 2024 /opt/bdp/hive-4.0.1/lib/orc-shims-1.8.5.jar
-rw-r--r-- 1 root root  132316 Sep 25 2024 /opt/bdp/hive-4.0.1/lib/orc-tools-1.8.5.jar
[root@hadoop201 lib]# ls -l /opt/bdp/hive-4.0.1/lib/protobuf-java-*.jar
-rw-r--r-- 1 root root 1838876 Sep 25 2024 /opt/bdp/hive-4.0.1/lib/protobuf-java-3.24.4.jar
[root@hadoop201 lib]# ls -l /opt/bdp/hadoop-3.4.0/share/hadoop/common/lib/protobuf-java-*.jar
ls: cannot access /opt/bdp/hadoop-3.4.0/share/hadoop/common/lib/protobuf-java-*.jar: No such file or directory

Hadoop does not ship this jar. Inserting data into an ORC table fails with the following error:

Diagnostic Messages for this Task:
Error: java.io.IOException: java.lang.reflect.InvocationTargetException
        at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
        at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
        at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:268)
        at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.<init>(HadoopShimsSecure.java:214)
        at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getRecordReader(HadoopShimsSecure.java:342)
        at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:711)
        at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:176)
        at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:445)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:350)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:178)
        at java.base/java.security.AccessController.doPrivileged(Native Method)
        at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1953)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:172)
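The nested exception in the full task log is what pinpoints the real cause, but given that Hive 4.0.1 ships protobuf-java-3.24.4 while the Hadoop 3.4.0 common lib contains no protobuf-java jar at all, one plausible workaround is to make that jar visible to the MapReduce tasks. The following is a sketch under that assumption, reusing the paths shown above; confirm the actual missing class from the task log before applying it.

# Copy the protobuf jar shipped with Hive into Hadoop's common lib (repeat on every worker node)
cp /opt/bdp/hive-4.0.1/lib/protobuf-java-3.24.4.jar /opt/bdp/hadoop-3.4.0/share/hadoop/common/lib/

# Or let Hive ship the jar with each job instead of modifying the Hadoop installation
hive --hiveconf hive.aux.jars.path=file:///opt/bdp/hive-4.0.1/lib/protobuf-java-3.24.4.jar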