Preface: In the previous section, using the modules.d/kafka.yml configuration, we read log files from a fixed path into Elasticsearch and searched and displayed them with Kibana. In a real deployment, however, the flow is: a microservice application writes its logs to a Kafka topic, Filebeat consumes that topic as its input, and Elasticsearch is the output. In this section we implement that pipeline.
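The first hop of that flow — the application pushing logs into a Kafka topic — can be sketched with the kafka-python client (an assumption; any Kafka client library would do). The broker address and the topic name innerweb are taken from the Filebeat config later in this post; the encode_log helper name is hypothetical:

```python
import json

def encode_log(record: dict) -> bytes:
    """Serialize one log record to UTF-8 JSON bytes for the Kafka topic."""
    return json.dumps(record).encode("utf-8")

# Actually sending requires a reachable broker; kafka-python is assumed installed:
# from kafka import KafkaProducer
# producer = KafkaProducer(
#     bootstrap_servers="20.11.2.183:9092",
#     value_serializer=encode_log,
# )
# producer.send("innerweb", {"level": "INFO", "msg": "order created"})
# producer.flush()

print(encode_log({"level": "INFO", "msg": "order created"}))
```

Filebeat then picks each such JSON line out of the topic as one event, so keeping one record per message is the simplest layout.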
First, check the compatibility between Filebeat and your Kafka version. Sometimes the company's Kafka never gets upgraded and there is nothing you can do about it — 0.9.0.0 is not supported. From the Filebeat documentation:
Compatibility
This input works with all Kafka versions in between 0.11 and 2.1.0. Older versions might work as well, but are not supported.
filebeat.inputs:
# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.
- type: kafka
  hosts:
    - 20.11.2.183:9092
    # - kafka-broker-2:9092
    - 10.11.55.11:9092
  topics: ["test2", "innerweb"]
  group_id: "innerweb"
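The snippet above only covers the input side. A runnable filebeat.yml also needs an output section; a minimal sketch, assuming Elasticsearch is reachable on localhost at the default port (replace the host with your cluster's address):

```yaml
filebeat.inputs:
- type: kafka
  hosts:
    - 20.11.2.183:9092
    - 10.11.55.11:9092
  topics: ["test2", "innerweb"]
  group_id: "innerweb"

# Assumption: adjust hosts to point at your actual Elasticsearch cluster.
output.elasticsearch:
  hosts: ["localhost:9200"]
```

With this in place, starting Filebeat makes it join the consumer group innerweb and ship each Kafka message to Elasticsearch as an event.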
# Produce to and consume from the Kafka topic on the console, as a quick smoke test
bin/kafka-console-producer.sh --bootstrap-server localhost:9092 --topic innerweb
bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic innerweb --from-beginning
# On old Kafka releases the consumer took --zookeeper localhost:2181 instead of
# --bootstrap-server; that flag was deprecated and removed in newer releases.