Project scenario:
Multiple files in a directory are being appended to in real time: content keeps being appended to files in a local Linux folder, a Flume source component monitors those files, and the collected data is delivered to its destination (written out to HDFS).
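For reference, here is a minimal sketch of the kind of setup described above: a TAILDIR source tailing the appended log files, a memory channel, and an HDFS sink. Every name and path in it (agent a1, /opt/module/..., hdfs://hadoop102:8020, the config file name) is an assumption made for illustration, not taken from this post; substitute your own values.

# Write a minimal agent config (all names and paths below are illustrative assumptions)
cat > "$FLUME_HOME/conf/taildir-hdfs.conf" <<'EOF'
a1.sources = r1
a1.channels = c1
a1.sinks = k1

# TAILDIR source: keeps tailing files under the watched directory as they are appended to
a1.sources.r1.type = TAILDIR
a1.sources.r1.positionFile = /opt/module/flume/taildir_position.json
a1.sources.r1.filegroups = f1
a1.sources.r1.filegroups.f1 = /opt/module/applog/.*\.log

# Memory channel buffers events between source and sink
a1.channels.c1.type = memory
a1.channels.c1.capacity = 10000
a1.channels.c1.transactionCapacity = 1000

# HDFS sink: writes the collected data out to HDFS, bucketed by time
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://hadoop102:8020/flume/%Y%m%d/%H
a1.sinks.k1.hdfs.useLocalTimeStamp = true
a1.sinks.k1.hdfs.fileType = DataStream

a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
EOF

# Start the agent and watch the console log
"$FLUME_HOME/bin/flume-ng" agent -n a1 -c "$FLUME_HOME/conf" -f "$FLUME_HOME/conf/taildir-hdfs.conf" -Dflume.root.logger=INFO,console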
Problem description:
After the source detects appends to a monitored file and the data is pushed toward HDFS, the sink throws an error, and the corresponding directory is never created on HDFS:
java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357) ~[hadoop-common-3.1.3.jar:?]
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338) ~[hadoop-common-3.1.3.jar:?]
at org.apache.hadoop.conf.Configuration.setBoolean(Configuration.java:1679) ~[hadoop-common-3.1.3.jar:?]
at org.apache.flume.sink.hdfs.BucketWriter.open(BucketWriter.java:221) ~[flume-hdfs-sink-1.10.1.jar:1.10.1]
at org.apache.flume.sink.hdfs.BucketWriter.append(BucketWriter.java:572) ~[flume-hdfs-sink-1.10.1.jar:1.10.1]
at org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:410) ~[flume-hdfs-sink-1.10.1.jar:1.10.1]
at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:39) ~[flume-ng-core-1.10.1.jar:1.10.1]
at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:145) ~[flume-ng-core-1.10.1.jar:1.10.1]
at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_212]
Exception in thread "SinkRunner-PollingRunner-DefaultSinkProcessor" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
at org.apache.hadoop.conf.Configuration.setBoolean(Configuration.java:1679)
at org.apache.flume.sink.hdfs.BucketWriter.open(BucketWriter.java:221)
at org.apache.flume.sink.hdfs.BucketWriter.append(BucketWriter.java:572)
at org.apache.flume.sink.hdfs.HDFSEventSink.process(HDFSEventSink.java:410)
at org.apache.flume.sink.DefaultSinkProcessor.process(DefaultSinkProcessor.java:39)
at org.apache.flume.SinkRunner$PollingRunner.run(SinkRunner.java:145)
at java.lang.Thread.run(Thread.java:748)
Cause analysis:
In my case the root cause is that my Hadoop release (3.1.3) is on the older side and clashes with this Flume release: Hadoop's lib directory and Flume's lib directory each ship their own guava jar, and with both on the classpath an older Guava ends up being loaded, one that lacks the Preconditions.checkArgument(boolean, String, Object) overload that hadoop-common 3.1.3 calls, hence the NoSuchMethodError.
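To confirm which side ships the older Guava, you can list the checkArgument overloads in each bundled jar with javap; the paths below are assumptions, so point them at wherever your guava jars actually live. Old Guava (11.x) only provides the varargs overload, while the (boolean, String, Object) overload from the stack trace was added in later Guava releases.

# Print the checkArgument overloads of every bundled Guava (paths are illustrative)
for j in /opt/module/flume/lib/guava-*.jar /opt/module/hadoop-3.1.3/share/hadoop/common/lib/guava-*.jar; do
  echo "== $j"
  javap -classpath "$j" com.google.common.base.Preconditions | grep checkArgument
done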
Solution:
Because the guava jar under Hadoop's lib directory and the guava jar under Flume's lib directory conflict, deleting the guava jar under Flume's lib, so that only Hadoop's Guava remains on the classpath, resolves the error.
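A minimal sketch of that fix, assuming install locations such as /opt/module/flume and /opt/module/applog (substitute your own paths), with a backup step so the change is easy to undo:

# Back up and then remove the guava jar shipped under Flume's lib,
# leaving only Hadoop's Guava on the agent's classpath
cp /opt/module/flume/lib/guava-*.jar /tmp/
rm /opt/module/flume/lib/guava-*.jar

# Restart the agent, append a test line to a monitored file, and check that
# the corresponding directory now appears on HDFS
# (the HDFS path to list depends on your sink's hdfs.path setting)
echo "hello flume" >> /opt/module/applog/app.log
hdfs dfs -ls /flume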