Kafka in Practice: Integrating Flume and Kafka for Real-Time Data Collection

This post shows how to modify a Flume configuration file to build a real-time data collection pipeline that takes data entered at the console and delivers it to Kafka. The steps are: configure an Avro source, a memory channel, and a Kafka sink, and have the sink write the data to a designated Kafka topic.


1. Pipeline: integrating Flume and Kafka for real-time data collection

Data is delivered to a Flume agent whose Avro source listens on localhost:44444. Events are buffered in a memory channel and written by a Kafka sink to the peng_topic topic on the broker at localhost:9092, where downstream consumers can read them in real time.

2. Modify the Flume configuration file: from console output to Kafka

/**
 * Previous configuration: Avro source with a logger (console) sink
 */
[peng@bogon conf]$ cat avro-memory-logger.conf 
avro-memory-logger.sources = avro-source
avro-memory-logger.sinks = logger-sink
avro-memory-logger.channels = memory-channel

avro-memory-logger.sources.avro-source.type = avro
avro-memory-logger.sources.avro-source.bind = localhost
avro-memory-logger.sources.avro-source.port = 44444

avro-memory-logger.sinks.logger-sink.type = logger

avro-memory-logger.channels.memory-channel.type = memory

avro-memory-logger.sources.avro-source.channels=memory-channel
avro-memory-logger.sinks.logger-sink.channel=memory-channel
[peng@bogon conf]$ 
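
To verify this logger configuration, the agent can be started roughly as follows (a sketch; it assumes FLUME_HOME points at your Flume installation and that avro-memory-logger.conf lives in $FLUME_HOME/conf):

flume-ng agent \
  --name avro-memory-logger \
  --conf $FLUME_HOME/conf \
  --conf-file $FLUME_HOME/conf/avro-memory-logger.conf \
  -Dflume.root.logger=INFO,console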

/**
 * New configuration: Avro source with a Kafka sink
 */

avro-memory-kafka.sources = avro-source
avro-memory-kafka.sinks = kafka-sink
avro-memory-kafka.channels = memory-channel

avro-memory-kafka.sources.avro-source.type = avro
avro-memory-kafka.sources.avro-source.bind = localhost
avro-memory-kafka.sources.avro-source.port = 44444
    
avro-memory-kafka.sinks.kafka-sink.type = org.apache.flume.sink.kafka.KafkaSink
avro-memory-kafka.sinks.kafka-sink.brokerList = localhost:9092
avro-memory-kafka.sinks.kafka-sink.topic = peng_topic
avro-memory-kafka.sinks.kafka-sink.batchSize = 5
avro-memory-kafka.sinks.kafka-sink.requiredAcks = 1

avro-memory-kafka.channels.memory-channel.type = memory

avro-memory-kafka.sources.avro-source.channels=memory-channel
avro-memory-kafka.sinks.kafka-sink.channel=memory-channel
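
With the Kafka sink in place, start the new agent the same way (again a sketch; paths assume $FLUME_HOME, and the file above is saved as avro-memory-kafka.conf in $FLUME_HOME/conf):

flume-ng agent \
  --name avro-memory-kafka \
  --conf $FLUME_HOME/conf \
  --conf-file $FLUME_HOME/conf/avro-memory-kafka.conf \
  -Dflume.root.logger=INFO,console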
                                                  


 
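To push some test data into the Avro source and confirm it reaches Kafka, one option is flume-ng avro-client plus the Kafka console consumer (a sketch: /tmp/test.log is a hypothetical input file, and the consumer flag depends on your Kafka version):

# send the contents of a file to the Avro source on localhost:44444
flume-ng avro-client --conf $FLUME_HOME/conf -H localhost -p 44444 -F /tmp/test.log

# watch the events arrive on peng_topic (older Kafka versions use --zookeeper)
kafka-console-consumer.sh --zookeeper localhost:2181 --topic peng_topic --from-beginning
# newer Kafka versions use --bootstrap-server instead:
# kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic peng_topic --from-beginning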

 
