You only need to add the following to your code:
props.put("auto.offset.reset", "earliest");
props.put("group.id", UUID.randomUUID().toString());
props.put("group.id", UUID.randomUUID().toString());
Complete example
import java.util.Arrays;
import java.util.Properties;
import java.util.UUID;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

// 1. Prepare the consumer configuration (producer-only settings such as acks, retries and batch.size are not needed here)
Properties props = new Properties();
props.put("bootstrap.servers", "hadoop1:9092");
// use a fresh, random group.id so there are no committed offsets and auto.offset.reset applies
props.put("group.id", UUID.randomUUID().toString());
props.put("enable.auto.commit", "true");
props.put("auto.commit.interval.ms", "1000");
props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
props.put("auto.offset.reset", "earliest");
// 2. Create the KafkaConsumer
KafkaConsumer<String, String> kafkaConsumer = new KafkaConsumer<String, String>(props);
// 3. Subscribe to the data; more than one topic can be listed here
kafkaConsumer.subscribe(Arrays.asList("yun03"));
// 4. Fetch the data
while (true) {
    ConsumerRecords<String, String> records = kafkaConsumer.poll(100);
    for (ConsumerRecord<String, String> record : records) {
        System.out.printf("topic = %s, offset = %d, key = %s, value = %s%n", record.topic(), record.offset(), record.key(), record.value());
    }
}
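As a side note, if you want an existing consumer group to re-read the topic from the start without switching to a new group.id, the consumer client also offers seekToBeginning(). The snippet below is a minimal sketch, not part of the original example: it reuses the kafkaConsumer from above and rewinds every partition to its earliest offset each time partitions are assigned, before the normal poll loop runs.

import java.util.Collection;
import org.apache.kafka.clients.consumer.ConsumerRebalanceListener;
import org.apache.kafka.common.TopicPartition;

// subscribe with a rebalance listener that rewinds each newly assigned partition
kafkaConsumer.subscribe(Arrays.asList("yun03"), new ConsumerRebalanceListener() {
    @Override
    public void onPartitionsRevoked(Collection<TopicPartition> partitions) {
        // nothing to do before partitions are taken away
    }
    @Override
    public void onPartitionsAssigned(Collection<TopicPartition> partitions) {
        // jump back to the earliest available offset of every partition just assigned to this consumer
        kafkaConsumer.seekToBeginning(partitions);
    }
});
// then poll in a loop exactly as in the example above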
This post has walked through how to configure a Kafka consumer in Java so that it reads from the very beginning of the topic, giving a complete consume-from-the-start workflow.