Troubleshooting an HBase thrift2 startup failure caused by a jar conflict

This post describes an AbstractMethodError thrown while starting the HBase Thrift2 service, caused by a jar version conflict. By analyzing the logs and configuration files, the author found that the jar providing the MetricsConfig class was not the expected version: the method invoked on a class loaded through a custom classloader took a different parameter type than the one actually declared. A small test program was written to locate the conflicting jar and resolve the problem.


1. Start command

${HBASE_HOME}/bin/hbase-daemon.sh start thrift2

2. The exception

Error log in hbase-root-thrift2-hdfs-test07.yingzi.com.out:

Exception in thread "main" java.lang.AbstractMethodError: org.apache.hadoop.metrics2.sink.timeline.HadoopTimelineMetricsSink.init(Lorg/apache/commons/configuration/SubsetConfiguration;)V
    at org.apache.hadoop.metrics2.impl.MetricsConfig.getPlugin(MetricsConfig.java:199)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.newSink(MetricsSystemImpl.java:529)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.configureSinks(MetricsSystemImpl.java:501)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.configure(MetricsSystemImpl.java:480)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.start(MetricsSystemImpl.java:189)
    at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.init(MetricsSystemImpl.java:164)
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.init(DefaultMetricsSystem.java:54)
    at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.initialize(DefaultMetricsSystem.java:50)
    at org.apache.hadoop.hbase.metrics.BaseSourceImpl$DefaultMetricsSystemInitializer.init(BaseSourceImpl.java:51)
    at org.apache.hadoop.hbase.metrics.BaseSourceImpl.<init>(BaseSourceImpl.java:112)
    at org.apache.hadoop.hbase.metrics.ExceptionTrackingSourceImpl.<init>(ExceptionTrackingSourceImpl.java:44)
    at org.apache.hadoop.hbase.thrift.MetricsThriftServerSourceImpl.<init>(MetricsThriftServerSourceImpl.java:59)
    at org.apache.hadoop.hbase.thrift.MetricsThriftServerSourceFactoryImpl$FactoryStorage.<init>(MetricsThriftServerSourceFactoryImpl.java:35)
    at org.apache.hadoop.hbase.thrift.MetricsThriftServerSourceFactoryImpl$FactoryStorage.<clinit>(MetricsThriftServerSourceFactoryImpl.java:34)
    at org.apache.hadoop.hbase.thrift.MetricsThriftServerSourceFactoryImpl.createThriftTwoSource(MetricsThriftServerSourceFactoryImpl.java:52)
    at org.apache.hadoop.hbase.thrift.ThriftMetrics.<init>(ThriftMetrics.java:72)
    at org.apache.hadoop.hbase.thrift2.ThriftServer.run(ThriftServer.java:473)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.hadoop.hbase.thrift2.ThriftServer.main(ThriftServer.java:374)

3. Analysis &amp; troubleshooting

This error is almost certainly caused by a jar conflict; the hard part is pinpointing the offending jar.

I previously ran into a Hive startup failure caused by a jar conflict, documented in the CSDN post "一个Hive curator-client.jar包冲突问题排查解决".

The troubleshooting approach there carries over, though the HBase case is a bit more complicated than Hive's.

Reading the log alongside the source code turned up a key configuration file: /usr/hdp/3.0.1.0-187/hbase/conf/hadoop-metrics2-hbase.properties

The relevant settings are:

*.timeline.plugin.urls=file:///usr/lib/ambari-metrics-hadoop-sink/ambari-metrics-hadoop-sink.jar
hbase.sink.timeline.class=org.apache.hadoop.metrics2.sink.timeline.HadoopTimelineMetricsSink

Roughly, a custom classloader loads the class org.apache.hadoop.metrics2.sink.timeline.HadoopTimelineMetricsSink from the external jar /usr/lib/ambari-metrics-hadoop-sink/ambari-metrics-hadoop-sink.jar and then calls its init method.
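One detail worth understanding here is classloader delegation: even when a plugin class is loaded from an external jar, any classes it references are resolved parent-first, i.e. from the host classpath when available. A minimal sketch of that behaviour (not HBase's actual loader, just the standard URLClassLoader default it builds on):

```java
import java.net.URL;
import java.net.URLClassLoader;

public class PluginLoaderSketch {
    public static void main(String[] args) throws Exception {
        // A plugin classloader whose parent is the host classloader; here it is
        // given no jars of its own, to isolate the delegation behaviour.
        URLClassLoader plugin = new URLClassLoader(
                new URL[0], PluginLoaderSketch.class.getClassLoader());

        // Standard classloaders delegate parent-first: the request resolves in
        // the parent, so the plugin sees the SAME Class object as the host. This
        // is why a sink loaded from an external jar gets linked against whatever
        // SubsetConfiguration the host classpath already provides.
        Class<?> c = plugin.loadClass("java.lang.String");
        System.out.println(c == String.class); // prints: true
        plugin.close();
    }
}
```

So the plugin jar and the host classpath must agree on shared types; when they do not, linkage errors like the one above appear.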

I downloaded /usr/lib/ambari-metrics-hadoop-sink/ambari-metrics-hadoop-sink.jar and decompiled it locally,

then located the class org.apache.hadoop.metrics2.sink.timeline.HadoopTimelineMetricsSink and its init(SubsetConfiguration conf) method.

Looking closely, the SubsetConfiguration here comes from the package org.apache.commons.configuration2:
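Decompiling works, but reflection gets the same answer faster: load the class and print the parameter types of its declared init overloads. A sketch, using java.util.ArrayList's add method as a stand-in since the sink jar is environment-specific (pass the real class and method name as arguments on a classpath that includes the sink jar):

```java
import java.lang.reflect.Method;
import java.util.Arrays;

public class InitSignaturePrinter {
    public static void main(String[] args) throws Exception {
        // e.g. org.apache.hadoop.metrics2.sink.timeline.HadoopTimelineMetricsSink init
        String className  = args.length > 0 ? args[0] : "java.util.ArrayList";
        String methodName = args.length > 1 ? args[1] : "add";
        Class<?> c = Class.forName(className);
        for (Method m : c.getDeclaredMethods()) {
            if (m.getName().equals(methodName)) {
                // The parameter types show exactly which package (configuration
                // vs configuration2) the class was compiled against.
                System.out.println(m.getName() + Arrays.toString(m.getParameterTypes()));
            }
        }
    }
}
```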

The exception above, however, references org/apache/commons/configuration/SubsetConfiguration (note: no "2"): Exception in thread "main" java.lang.AbstractMethodError: org.apache.hadoop.metrics2.sink.timeline.HadoopTimelineMetricsSink.init(Lorg/apache/commons/configuration/SubsetConfiguration;)V

So the two SubsetConfiguration classes live in different packages: the method being invoked and the method actually declared by the loaded class take different parameter types, which is what produces the AbstractMethodError.

The loading of HadoopTimelineMetricsSink and the call to init() both happen in org.apache.hadoop.metrics2.impl.MetricsConfig.getPlugin(), so the likely culprit is a version conflict in the jar that provides MetricsConfig.

As before, I wrote a small program to find that jar, reusing the Maven project from the Hive curator-client post mentioned above.

Test.java becomes:

package com.tianzy.test;

public class Test {
    public static void main(String[] args) {
        try {
            // Print the location (jar or directory) that the running JVM
            // actually loaded MetricsConfig from.
            String filePath = Class.forName("org.apache.hadoop.metrics2.impl.MetricsConfig")
                    .getProtectionDomain().getCodeSource().getLocation().getFile();
            System.out.println(filePath);
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
        }
    }
}

Run mvn package to build hive_test-1.0-SNAPSHOT.jar, upload it to the server, and put it under /tmp/test/.

The tricky part is determining the classpath of the hbase thrift2 process; the startup scripts are complex enough that reconstructing the classpath by hand is painful.

I took a shortcut and borrowed the classpath of the running HBase RegionServer process: find the RegionServer's pid, extract CLASSPATH from /proc/${pid}/environ, and save it to /tmp/test/habase-classpath.txt.
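One gotcha: /proc/${pid}/environ is NUL-separated, not newline-separated, so plain grep over it is unreliable. A small sketch of the extraction in Java (the pid lookup itself is left out; "self" reads the current process's own environment):

```java
import java.nio.file.Files;
import java.nio.file.Paths;

public class EnvironReader {
    // Extract one variable from the NUL-separated contents of /proc/<pid>/environ.
    static String getVar(String environ, String name) {
        for (String entry : environ.split("\0")) {
            if (entry.startsWith(name + "=")) {
                return entry.substring(name.length() + 1);
            }
        }
        return null;
    }

    public static void main(String[] args) throws Exception {
        // Pass the RegionServer's pid as the first argument.
        String pid = args.length > 0 ? args[0] : "self";
        String environ = new String(Files.readAllBytes(
                Paths.get("/proc/" + pid + "/environ")));
        System.out.println(getVar(environ, "CLASSPATH"));
    }
}
```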

Then run:

cd /tmp/test
export CLASSPATH=`cat /tmp/test/habase-classpath.txt`:/tmp/test/hive_test-1.0-SNAPSHOT.jar
java com.tianzy.test.Test

Output: /data/hdp/3.0.1.0-187/hbase/lib/flink-realtime-data-hbase-1.0-SNAPSHOT.jar

This is obviously a hand-built jar that someone uploaded; who put it there is no longer traceable. After deleting it, hbase thrift2 started successfully.
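As a closing tip: Class.forName only reveals the copy that wins. To see every copy of a class on the classpath, and therefore spot duplicates directly, ClassLoader.getResources can be queried for the .class file. A sketch (defaults to MetricsConfig; pass any resource path as an argument):

```java
import java.net.URL;
import java.util.Enumeration;

public class DuplicateFinder {
    public static void main(String[] args) throws Exception {
        // getResources lists EVERY jar/directory on the classpath containing
        // this entry; more than one line means a potential conflict.
        String resource = args.length > 0 ? args[0]
                : "org/apache/hadoop/metrics2/impl/MetricsConfig.class";
        Enumeration<URL> copies =
                ClassLoader.getSystemClassLoader().getResources(resource);
        while (copies.hasMoreElements()) {
            System.out.println(copies.nextElement());
        }
    }
}
```

Run with the same exported CLASSPATH as above; if both the legitimate hadoop jar and the rogue jar show up, the conflict is confirmed without any guessing.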
