Kafka producer/consumer separation example, with fixes for `o.s.k.r.ReplyingKafkaTemplate : No pending reply: ConsumerRecord`

— [esContainer-C-1] o.s.k.r.ReplyingKafkaTemplate : No pending reply: ConsumerRecord

The problem: a send through Kafka's ReplyingKafkaTemplate succeeds, but the reply is never received.

Debugging inside the source class org.springframework.kafka.requestreply.ReplyingKafkaTemplate<K, V, R>,
two problems surface:
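The warning is easier to reason about once you see how the template tracks requests: ReplyingKafkaTemplate keeps a map of in-flight requests keyed by correlation id, and "No pending reply" means a reply arrived whose correlation id is no longer (or never was) in that map, typically because the request already timed out. A minimal, JDK-only sketch of that bookkeeping (PendingReplyDemo and its method names are hypothetical, not the Spring internals):

```java
import java.util.Map;
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.TimeoutException;

public class PendingReplyDemo {
    // in-flight requests keyed by correlation id, like the template's internal map
    static final Map<String, CompletableFuture<String>> pending = new ConcurrentHashMap<>();

    static CompletableFuture<String> sendAndReceive(String correlationId) {
        CompletableFuture<String> future = new CompletableFuture<>();
        pending.put(correlationId, future);
        return future;
    }

    // called by the reply listener container for each record on the reply topic
    static void onReply(String correlationId, String value) {
        CompletableFuture<String> future = pending.remove(correlationId);
        if (future == null) {
            // this is the situation the log line reports
            System.out.println("No pending reply: " + correlationId);
        } else {
            future.complete(value);
        }
    }

    public static void main(String[] args) {
        CompletableFuture<String> f = sendAndReceive("req-1");
        // simulate the timeout path: the template gives up and drops the entry
        pending.remove("req-1").completeExceptionally(new TimeoutException("reply timed out"));
        // the reply then arrives too late to be matched
        onReply("req-1", "late reply"); // prints: No pending reply: req-1
        System.out.println("completed exceptionally: " + f.isCompletedExceptionally()); // true
    }
}
```

This is why both fixes below matter: a generous reply timeout keeps the entry alive long enough, and a correctly configured reply container ensures replies are actually routed back while the entry still exists.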

1. Set a default reply timeout via setDefaultReplyTimeout:

	replyingKafkaTemplate.setDefaultReplyTimeout(Duration.ofMinutes(5));

2. No topic was specified for listening to replies (a listener container must be set up in the ReplyingKafkaTemplate).

	First, before calling replyingKafkaTemplate.sendAndReceive(producerRecord), set the reply-topic header:

	producerRecord.headers()
	        .add(new RecordHeader(KafkaHeaders.REPLY_TOPIC, "topic.quick.reply".getBytes()));

	Then configure the listener container to be set up in the ReplyingKafkaTemplate:

	@Bean
	public KafkaMessageListenerContainer<String,String> replyContainer() {
	    // the topic listened to here must match the one set in the REPLY_TOPIC header: topic.quick.reply
	    ContainerProperties containerProperties = new ContainerProperties("topic.quick.reply");
	    return new KafkaMessageListenerContainer<>(new DefaultKafkaConsumerFactory<>(consumerConfigs()), containerProperties);
	}

With the problems out of the way, here is the full working setup.

For Kafka installation and deployment, see: https://www.cnblogs.com/Robert-huge/p/5649826.html

I. Producer

1. pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.gzh.kafka.producer</groupId>
    <artifactId>producer</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>

    <name>kafka-producer-master</name>
    <description>demo project for kafka producer</description>

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.2.6.RELEASE</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
        <spring-kafka.version>2.4.6.RELEASE</spring-kafka.version>
        <java.version>1.8</java.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter</artifactId>
        </dependency>
        
        <!-- https://mvnrepository.com/artifact/org.springframework.kafka/spring-kafka -->
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
           <!--  <version>${spring-kafka.version}</version> -->
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.springframework.kafka/spring-kafka-test -->
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka-test</artifactId>
            <version>${spring-kafka.version}</version>
            <scope>test</scope>
        </dependency>
        
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>

</project>

2. application.properties

server.port=8000
spring.application.name=kafka-producer
#kafka configuration
spring.kafka.producer.bootstrap-servers=192.168.203.129:9092
spring.kafka.producer.key-serializer=org.apache.kafka.common.serialization.StringSerializer
spring.kafka.producer.value-serializer=org.apache.kafka.common.serialization.StringSerializer

spring.kafka.listener.missing-topics-fatal=false
#topic
kafka.app.topic.foo=test20200509

3. Producer application class

package com.gzh.kafka.producer;

import java.time.Duration;
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.annotation.EnableKafka;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.listener.ContainerProperties;
import org.springframework.kafka.listener.KafkaMessageListenerContainer;
import org.springframework.kafka.requestreply.ReplyingKafkaTemplate;

@SpringBootApplication
@EnableConfigurationProperties
@EnableKafka
public class KafkaProducerApplication {

    public static void main(String[] args) {
        SpringApplication.run(KafkaProducerApplication.class, args);
    }

    // Listener container to be set up in the ReplyingKafkaTemplate
    @Bean
    public KafkaMessageListenerContainer<String,String> replyContainer() {
        // the topic listened to here must match the one set in the REPLY_TOPIC header: topic.quick.reply
        ContainerProperties containerProperties = new ContainerProperties("topic.quick.reply");
        return new KafkaMessageListenerContainer<>(new DefaultKafkaConsumerFactory<>(consumerConfigs()),
                containerProperties);
    }

    // the core ReplyingKafkaTemplate bean
    @Bean
    public ReplyingKafkaTemplate<String,String,String> replyingKafkaTemplate(
            ProducerFactory<String,String> producerFactory,
            KafkaMessageListenerContainer<String,String> repliesContainer) {
        ReplyingKafkaTemplate<String,String,String> replyingKafkaTemplate = new ReplyingKafkaTemplate<>(producerFactory,
                repliesContainer);
        // default reply timeout
        replyingKafkaTemplate.setDefaultReplyTimeout(Duration.ofMinutes(5));
        return replyingKafkaTemplate;
    }

    @Bean
    public Map<String,Object> consumerConfigs() {
        Map<String,Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "192.168.203.129:9092");
        // default group id
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "repliesGroup");
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, true);
        props.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "100");
        props.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, "15000");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return props;
    }

}

4. Producer controller class

package com.gzh.kafka.producer.controller;

import java.util.concurrent.ExecutionException;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

import com.gzh.kafka.producer.service.KafkaMessageSendService;

@RestController
// @RequestMapping(value = "send", produces = MediaType.APPLICATION_JSON_UTF8_VALUE)
@RequestMapping(value = "send")
public class KafkaMessageSendController {

    @Autowired
    private KafkaMessageSendService kafkaMessageSendService;

    @RequestMapping(value = "/sendMessage")
    public String send(@RequestParam(required = true) String message) {
        System.out.println("*****************" + message);
        try {
            kafkaMessageSendService.send(message);
        } catch (Exception e) {
            return "send failed.";
        }
        return message;
    }

    @RequestMapping(value = "/sendReceive")
    public String sendReceive(@RequestParam(required = true) String message)
            throws InterruptedException, ExecutionException {
        String retMsg = kafkaMessageSendService.sendReceive(message);
        return retMsg;
    }

}

5. Producer service class

package com.gzh.kafka.producer.service;

import java.util.concurrent.ExecutionException;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.header.internals.RecordHeader;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.requestreply.ReplyingKafkaTemplate;
import org.springframework.kafka.requestreply.RequestReplyFuture;
import org.springframework.kafka.support.KafkaHeaders;
import org.springframework.kafka.support.SendResult;
import org.springframework.stereotype.Service;
import org.springframework.util.concurrent.ListenableFuture;

@Service
public class KafkaMessageSendService {

    private static final Logger LOG = LoggerFactory.getLogger(KafkaMessageSendService.class);

    @Autowired
    private KafkaTemplate<String,String> kafkaTemplate;

    @Autowired
    private ReplyingKafkaTemplate<String,String,String> replyingKafkaTemplate;

    @Value("${kafka.app.topic.foo}")
    private String topic;

    public void send(String message) {
        LOG.info("topic=" + topic + ",message=" + message);
        ListenableFuture<SendResult<String,String>> future = kafkaTemplate.send(topic, message);
        future.addCallback(success -> LOG.info("KafkaMessageProducer message sent successfully!"),
                fail -> LOG.error("KafkaMessageProducer failed to send message!"));
    }


    public String sendReceive(String message) throws InterruptedException, ExecutionException {
        LOG.info("topic=topic-s-r,message=" + message);
        ProducerRecord<String,String> producerRecord = new ProducerRecord<String,String>("topic-s-r", message);
        producerRecord.headers()
                .add(new RecordHeader(KafkaHeaders.REPLY_TOPIC, "topic.quick.reply".getBytes()));
        RequestReplyFuture<String,String,String> future = replyingKafkaTemplate.sendAndReceive(producerRecord);
        // confirm the send itself via the send future; the outer future completes with the reply
        future.getSendFuture().addCallback(success -> LOG.info("KafkaMessageProducer message sent successfully!"),
                fail -> LOG.error("KafkaMessageProducer failed to send message!"));

        ConsumerRecord<String,String> consumerRecord = future.get();
        System.out.println("===============>Return value: " + consumerRecord.toString());
        return consumerRecord.value();
    }

}
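One caveat with future.get() in sendReceive above: when no reply arrives within the default timeout, get() throws an ExecutionException wrapping the real cause, so it is worth unwrapping before logging. A JDK-only illustration of that pattern (plain futures stand in for the Kafka types):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.TimeoutException;

public class ReplyGetDemo {
    public static void main(String[] args) throws Exception {
        CompletableFuture<String> reply = new CompletableFuture<>();
        // simulate the template timing the request out
        reply.completeExceptionally(new TimeoutException("no reply within defaultReplyTimeout"));
        try {
            reply.get();
        } catch (ExecutionException e) {
            // get() wraps the failure; log the cause, not the generic ExecutionException
            System.out.println("reply failed: " + e.getCause().getMessage());
        }
    }
}
```

In the real service method, catching the ExecutionException instead of declaring it thrown lets the controller return a meaningful error message rather than a stack trace.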

6. Producer scheduled-send demo

package com.gzh.kafka.producer.component;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.support.SendResult;
import org.springframework.scheduling.annotation.EnableScheduling;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
import org.springframework.util.concurrent.ListenableFuture;

@Component
@EnableScheduling
public class KafkaMessageProducer {

    private static final Logger LOG = LoggerFactory.getLogger(KafkaMessageProducer.class);

    @Autowired
    private KafkaTemplate<String,String> kafkaTemplate;

    @Value("${kafka.app.topic.foo}")
    private String topic;

    @Scheduled(cron = "0 0/5 * * * ?")
    public void send() {
        String message = "Hello World---" + System.currentTimeMillis();
        LOG.info("topic=" + topic + ",message=" + message);
        ListenableFuture<SendResult<String,String>> future = kafkaTemplate.send(topic, message);
        future.addCallback(success -> LOG.info("KafkaMessageProducer message sent successfully!"),
                fail -> LOG.error("KafkaMessageProducer failed to send message!"));
    }
}

II. Consumer

1. Consumer pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
    xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.gzh.kafka.consumer</groupId>
    <artifactId>consumer</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <packaging>jar</packaging>

    <name>kafka-consumer-master</name>
    <description>demo project for kafka consumer</description>

    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.2.6.RELEASE</version>
        <relativePath/> <!-- lookup parent from repository -->
    </parent>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <project.reporting.outputEncoding>UTF-8</project.reporting.outputEncoding>
        <spring-kafka.version>2.4.6.RELEASE</spring-kafka.version>
        <java.version>1.8</java.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter</artifactId>
        </dependency>
        
        <!-- https://mvnrepository.com/artifact/org.springframework.kafka/spring-kafka -->
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka</artifactId>
            <version>2.4.6.RELEASE</version>
        </dependency>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
        <!-- https://mvnrepository.com/artifact/org.springframework.kafka/spring-kafka-test -->
        <dependency>
            <groupId>org.springframework.kafka</groupId>
            <artifactId>spring-kafka-test</artifactId>
            <version>2.4.6.RELEASE</version>
            <scope>test</scope>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
            </plugin>
        </plugins>
    </build>
</project>

2. Consumer application.properties

server.port=8001
spring.application.name=kafka-consumer
#kafka configuration
# commit offsets automatically after consumption so the next poll resumes from there
spring.kafka.consumer.enable-auto-commit=true
# consumer group id
spring.kafka.consumer.group-id=repliesGroup
# Kafka broker addresses
spring.kafka.consumer.bootstrap-servers=192.168.203.129:9092
# where to start when there is no committed offset (latest; use earliest to read from the beginning)
spring.kafka.consumer.auto-offset-reset=latest
spring.kafka.listener.missing-topics-fatal=false
spring.kafka.consumer.key-deserializer=org.apache.kafka.common.serialization.StringDeserializer
spring.kafka.consumer.value-deserializer=org.apache.kafka.common.serialization.StringDeserializer
#topic
kafka.app.topic.foo=test20200509

3. Consumer application class

package com.gzh.kafka.consumer;

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.boot.context.properties.EnableConfigurationProperties;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.config.KafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;
import org.springframework.kafka.listener.ConcurrentMessageListenerContainer;

@SpringBootApplication
@EnableConfigurationProperties
public class KafkaConsumerApplication {

    public static void main(String[] args) {
        SpringApplication.run(KafkaConsumerApplication.class, args);
    }

    @Bean
    public Map<String,Object> consumerConfigs() {
        Map<String,Object> props = new HashMap<>();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "192.168.203.129:9092");
        // default group id
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "repliesGroup");
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, true);
        props.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "100");
        props.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, "15000");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        return props;
    }

    // default producer factory, used to send replies
    @Bean
    public ProducerFactory<String,String> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    // standard KafkaProducer settings - broker and serializers
    @Bean
    public Map<String,Object> producerConfigs() {
        Map<String,Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "192.168.203.129:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        return props;
    }
    
    @Bean
    public ConsumerFactory<String, String> consumerFactory() {
        return new DefaultKafkaConsumerFactory<>(consumerConfigs(), new StringDeserializer(),
                new StringDeserializer());
    }
    
    // concurrent listener container factory
    @Bean
    public KafkaListenerContainerFactory<ConcurrentMessageListenerContainer<String,String>> kafkaListenerContainerFactory() {
        ConcurrentKafkaListenerContainerFactory<String,String> factory = new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory());
        // NOTE - set the reply template used by @SendTo
        factory.setReplyTemplate(kafkaTemplate());
        return factory;
    }
    
    // Standard KafkaTemplate
    @Bean
    public KafkaTemplate<String,String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
}

4. Consumer listener class

package com.gzh.kafka.consumer.service;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.messaging.MessageHeaders;
import org.springframework.messaging.handler.annotation.Headers;
import org.springframework.messaging.handler.annotation.Payload;
import org.springframework.messaging.handler.annotation.SendTo;
import org.springframework.stereotype.Component;


@Component
public class KafkaMessageConsumer {

    private static final Logger LOG = LoggerFactory.getLogger(KafkaMessageConsumer.class);

    @KafkaListener(topics = { "${kafka.app.topic.foo}" })
    public void receive(@Payload String message, @Headers MessageHeaders headers) {
        LOG.info("KafkaMessageConsumer received message: " + message);
        headers.keySet()
                .forEach(key -> LOG.info("{}: {}", key, headers.get(key)));
    }

    @KafkaListener(id = "replyConsumer", topics = { "topic-s-r" }, containerFactory = "kafkaListenerContainerFactory")
    @SendTo
    public String returnReceive(@Payload String message, @Headers MessageHeaders headers) {
        LOG.info("KafkaMessageConsumer received message: " + message + ", returning an acknowledgement");
        headers.keySet()
                .forEach(key -> LOG.info("{}: {}", key, headers.get(key)));
        return "return-Message";
    }
}
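The bare @SendTo on returnReceive works because, when no destination is given, the reply is sent to the topic named in the incoming record's kafka_replyTopic header (KafkaHeaders.REPLY_TOPIC), which is exactly what the producer set before calling sendAndReceive. A plain-Java sketch of that resolution (SendToResolutionDemo and resolveReplyTopic are hypothetical helpers, not Spring's code):

```java
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

public class SendToResolutionDemo {
    // resolve the reply destination the way an empty @SendTo does:
    // read the reply-topic header carried by the request record
    static String resolveReplyTopic(Map<String, byte[]> headers) {
        byte[] raw = headers.get("kafka_replyTopic"); // the value of KafkaHeaders.REPLY_TOPIC
        return raw == null ? null : new String(raw, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        Map<String, byte[]> headers = new HashMap<>();
        headers.put("kafka_replyTopic", "topic.quick.reply".getBytes(StandardCharsets.UTF_8));
        System.out.println("reply goes to: " + resolveReplyTopic(headers)); // topic.quick.reply
    }
}
```

This is also why the reply container on the producer side must listen on the same topic name that the header carries: the consumer echoes the header back verbatim.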