Hadoop 2.2 uses protobuf 2.5 for its RPC, but Flume ships an older protobuf (2.4.1) in its lib directory that gets loaded ahead of Hadoop’s, which causes this error. To fix this you’ll need to move both protobuf and guava out of Flume’s lib directory. The following command moves them into your home directory.
$ mv ${flume_bin}/lib/{protobuf-java-2.4.1.jar,guava-10.0.1.jar} ~/
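To confirm the conflicting jars are really off Flume’s classpath, a quick sanity check (assuming, as above, that ${flume_bin} points at your Flume install root):

$ ls ${flume_bin}/lib | grep -iE 'protobuf|guava'

No output means neither jar remains in Flume’s lib directory, so the protobuf 2.5 and guava jars from the Hadoop classpath will be picked up instead.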
Now if you restart your Flume agent you’ll be able to target HDFS as a sink with Hadoop 2.2. Great success!
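As a rough sketch of what that looks like end to end (agent1, hdfs-sink, mem-channel, and the paths below are placeholder names for your own setup, not anything Flume requires), the sink portion of an agent config targeting HDFS is just:

agent1.sinks.hdfs-sink.type = hdfs
agent1.sinks.hdfs-sink.hdfs.path = hdfs://namenode:8020/flume/events
agent1.sinks.hdfs-sink.channel = mem-channel

and the restart itself:

$ ${flume_bin}/bin/flume-ng agent --conf ${flume_bin}/conf \
    --conf-file ${flume_bin}/conf/hdfs-agent.properties --name agent1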
Flume’s next release will move to protobuf 2.5, so this problem should magically disappear in due course.