How to get into a TP-Link router: what to do when the tplogin.cn login page won't open

This article explains what to do when the tplogin.cn management page of a TP-Link router cannot be opened, covering: checking the typed address, where in the browser to enter it, the cable connections, the phone's Wi-Fi connection, and rebooting the router. If none of these steps solves the problem, restore the router to factory settings and configure it again.

On newer TP-Link routers the management page address is tplogin.cn. If you cannot reach the tplogin.cn management page during setup, work through the checks below.

1. Make sure what you type in the browser is exactly tplogin.cn — note the dot between tplogin and cn. tplogincn, tplogincn.com, and tp1ogin.cn are all wrong. Double-check the address after typing it; the label on the back of the router shows the correct management address.
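The typo check above can be sketched as a tiny helper. This is purely illustrative — the class name and the rule it encodes are assumptions for demonstration, not anything shipped with the router:

```java
// Minimal sketch: accept only the exact TP-Link management address,
// rejecting common misspellings such as "tplogincn" or "tp1ogin.cn".
public class TploginAddressCheck {

    static final String CORRECT = "tplogin.cn";

    // True only for the exact management address (trimmed, case-insensitive).
    public static boolean isCorrect(String input) {
        return CORRECT.equalsIgnoreCase(input == null ? "" : input.trim());
    }

    public static void main(String[] args) {
        System.out.println(isCorrect("tplogin.cn"));   // correct address
        System.out.println(isCorrect("tplogincn"));    // missing the dot
        System.out.println(isCorrect("tp1ogin.cn"));   // digit 1 instead of letter l
    }
}
```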

2. After you open the browser, the address bar sits at the top of the window, and many start pages also show a search box in the middle of the page. Always type tplogin.cn into the address bar at the top, not the search box.

If you type tplogin.cn into the search box instead, you will get a page of search results (or no page at all) rather than the router's management page, so be sure to enter the address in the right place.

3. Check that the router's cables are connected correctly; a wrong connection will also make tplogin.cn unreachable. The correct wiring between the TP-Link router, the modem (incoming broadband line), and the computer is: plug the cable coming from the modem (the incoming broadband line) into the router's WAN port, and connect the computer to any port other than the WAN port.

4. If you log in from a phone, the phone must be connected to this TP-Link router's Wi-Fi. If it is connected to another router's Wi-Fi, or is using mobile data, it cannot reach tplogin.cn either.

If the router is being set up for the first time, simply connect the phone to the router's default Wi-Fi. It is normal that the phone has no internet access at this point — don't worry; just enter tplogin.cn in the browser to open the setup page.

5. If the four checks above all pass and tplogin.cn still will not open, unplug the router's power, wait a few minutes, plug it back in, and test the tplogin.cn page again.

6. If you have tried everything above and still cannot log in, the last and simplest resort is to restore the router to factory settings and set up the internet connection again; see: how to reset (restore factory settings on) a TP-Link router.

Here is a Spring Boot project — please analyze it and give the project directory layout.

```java
/*
 * Copyright (c) 2020, TP-Link Co.,Ltd. All rights reserved.
 */
// annotation/UserAuth.java
package com.tplink.nbu.demo.basicspringboot.annotation;

import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

/**
 * @author liufuyongle@tp-link.com.cn
 * @version 1.0
 * @since 2020/7/13
 */
@Target({ElementType.METHOD})
@Retention(RetentionPolicy.RUNTIME)
@Documented
public @interface UserAuth {
}
```

```java
// aspect/UserAuthAspect.java (repeated copyright/author headers omitted from here on)
package com.tplink.nbu.demo.basicspringboot.aspect;

import javax.servlet.http.HttpServletRequest;

import lombok.extern.slf4j.Slf4j;
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.aspectj.lang.annotation.Pointcut;
import org.springframework.http.HttpHeaders;
import org.springframework.stereotype.Component;
import org.springframework.web.context.request.RequestContextHolder;
import org.springframework.web.context.request.ServletRequestAttributes;

import com.tplink.nbu.demo.basicspringboot.bean.UserInfo;
import com.tplink.nbu.demo.basicspringboot.exception.UnauthorizedException;

@Slf4j
@Aspect
@Component
public class UserAuthAspect {

    @Pointcut("execution(public * com.tplink.nbu.demo.basicspringboot.controller.*.*(..))")
    public void controllers() {
        // controller pointcut definition
    }

    @Pointcut("@annotation(com.tplink.nbu.demo.basicspringboot.annotation.UserAuth)")
    public void needUserAuth() {
        // need user auth pointcut definition
    }

    @Around("controllers() && needUserAuth()")
    public Object around(ProceedingJoinPoint pjp) throws Throwable {
        ServletRequestAttributes requestAttributes =
                (ServletRequestAttributes) RequestContextHolder.getRequestAttributes();
        HttpServletRequest request = requestAttributes.getRequest();
        String credential = this.getCredential(request);
        if (credential == null) {
            throw new UnauthorizedException();
        }
        Object[] args = pjp.getArgs();
        Object[] newArgs = new Object[args.length];
        for (int i = 0; i < args.length; i++) {
            newArgs[i] = this.checkAndAssignUserInfo(args[i], new UserInfo(credential));
        }
        return pjp.proceed(newArgs);
    }

    private Object checkAndAssignUserInfo(Object newArg, UserInfo userInfo) {
        if (newArg instanceof UserInfo) {
            return userInfo;
        }
        return newArg;
    }

    private String getCredential(HttpServletRequest request) {
        return request.getHeader(HttpHeaders.AUTHORIZATION);
    }
}
```

```java
// controller/UserLogInOutController.java
package com.tplink.nbu.demo.basicspringboot.controller;

import javax.validation.Valid;

import lombok.extern.slf4j.Slf4j;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

import com.tplink.nbu.demo.basicspringboot.annotation.UserAuth;
import com.tplink.nbu.demo.basicspringboot.bean.UserInfo;
import com.tplink.nbu.demo.basicspringboot.dto.UserLoginDTO;
import com.tplink.nbu.demo.basicspringboot.dto.UserLoginSuccessDTO;
import com.tplink.nbu.demo.basicspringboot.exception.UnauthorizedException;
import com.tplink.nbu.demo.basicspringboot.service.UserService;

@Slf4j
@RestController
@RequestMapping("/user")
public class UserLogInOutController {

    @Autowired
    private UserService userService;

    @PostMapping("/login")
    public UserLoginSuccessDTO login(@Valid @RequestBody UserLoginDTO loginDTO) {
        boolean auth = userService.auth(loginDTO);
        if (!auth) {
            throw new UnauthorizedException();
        }
        log.info("{} login", loginDTO.getUsername());
        return UserLoginSuccessDTO.builder()
                .token(loginDTO.getUsername())
                .build();
    }

    @UserAuth
    @PostMapping("/logout")
    public UserInfo logout(UserInfo userInfo) {
        log.info("{} logout", userInfo.getUsername());
        return userInfo;
    }
}
```

```java
// dto/UserLoginDTO.java
package com.tplink.nbu.demo.basicspringboot.dto;

import javax.validation.constraints.NotEmpty;

import lombok.Getter;
import lombok.Setter;

@Getter
@Setter
public class UserLoginDTO {
    @NotEmpty
    private String username;
    @NotEmpty
    private String password;
}
```

```java
// dto/UserLoginSuccessDTO.java
package com.tplink.nbu.demo.basicspringboot.dto;

import lombok.Builder;
import lombok.Getter;

@Getter
@Builder
public class UserLoginSuccessDTO {
    private String token;
}
```

```java
// exception/UnauthorizedException.java
package com.tplink.nbu.demo.basicspringboot.exception;

import org.springframework.http.HttpStatus;
import org.springframework.web.bind.annotation.ResponseStatus;

@ResponseStatus(value = HttpStatus.UNAUTHORIZED)
public class UnauthorizedException extends RuntimeException {
}
```

```java
// service/UserService.java
package com.tplink.nbu.demo.basicspringboot.service;

import com.tplink.nbu.demo.basicspringboot.dto.UserLoginDTO;

public interface UserService {
    boolean auth(UserLoginDTO loginDTO);
}
```

```java
// service/UserServiceImpl.java
package com.tplink.nbu.demo.basicspringboot.service;

import org.springframework.stereotype.Service;

import com.tplink.nbu.demo.basicspringboot.dto.UserLoginDTO;

@Service
public class UserServiceImpl implements UserService {
    @Override
    public boolean auth(UserLoginDTO loginDTO) {
        return true;
    }
}
```

```java
// BasicSpringBootApplication.java
package com.tplink.nbu.demo.basicspringboot;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class BasicSpringBootApplication {
    public static void main(String[] args) {
        SpringApplication.run(BasicSpringBootApplication.class, args);
    }
}
```

```java
// bean/UserInfo.java
package com.tplink.nbu.demo.basicspringboot.bean;

import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;

@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
public class UserInfo {
    private String username;
}
```
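Based on the package declarations in the files above, the source layout can be inferred as follows (the project/module name and the presence of `pom.xml` and `src/main/resources` are assumptions — only the Java packages are given):

```
basic-spring-boot/
├── pom.xml
└── src/main/java/com/tplink/nbu/demo/basicspringboot/
    ├── BasicSpringBootApplication.java
    ├── annotation/UserAuth.java
    ├── aspect/UserAuthAspect.java
    ├── bean/UserInfo.java
    ├── controller/UserLogInOutController.java
    ├── dto/UserLoginDTO.java
    ├── dto/UserLoginSuccessDTO.java
    ├── exception/UnauthorizedException.java
    └── service/
        ├── UserService.java
        └── UserServiceImpl.java
```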
2025-10-14 11:41:13.070 INFO 928 --- [ main] c.t.n.demo.basicspringboot.DelayConsume : Starting DelayConsume using Java 1.8.0_462-462 on 18088363-BG with PID 928 (D:\r\idmdemo\target\classes started by admin in D:\r\idmdemo) 2025-10-14 11:41:13.071 INFO 928 --- [ main] c.t.n.demo.basicspringboot.DelayConsume : No active profile set, falling back to 1 default profile: "default" 2025-10-14 11:41:13.700 INFO 928 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat initialized with port(s): 8080 (http) 2025-10-14 11:41:13.707 INFO 928 --- [ main] o.apache.catalina.core.StandardService : Starting service [Tomcat] 2025-10-14 11:41:13.707 INFO 928 --- [ main] org.apache.catalina.core.StandardEngine : Starting Servlet engine: [Apache Tomcat/9.0.71] 2025-10-14 11:41:13.772 INFO 928 --- [ main] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring embedded WebApplicationContext 2025-10-14 11:41:13.772 INFO 928 --- [ main] w.s.c.ServletWebServerApplicationContext : Root WebApplicationContext: initialization completed in 678 ms 2025-10-14 11:41:14.235 INFO 928 --- [ main] o.s.b.a.e.web.EndpointLinksResolver : Exposing 1 endpoint(s) beneath base path '/actuator' 2025-10-14 11:41:14.264 INFO 928 --- [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 8080 (http) with context path '' 2025-10-14 11:41:14.274 INFO 928 --- [ main] c.t.n.demo.basicspringboot.DelayConsume : Started DelayConsume in 1.416 seconds (JVM running for 1.719) 2025-10-14 11:41:14.292 INFO 928 --- [ main] c.t.s.e.port.kafka.KafkaEventCenter : start to register topic: delay_topic_level_2, groupId: delay_consumer_group 2025-10-14 11:41:14.298 INFO 928 --- [ main] c.t.s.e.port.kafka.KafkaEventCenter : start to register topic: vms_dlq_hello-topic, groupId: delay 2025-10-14 11:41:14.300 INFO 928 --- [ main] c.t.s.e.port.kafka.KafkaEventCenter : DelayMs=2000, level=2, expirationTime=1760413276300 2025-10-14 11:41:14.310 INFO 928 --- [_consumer_group] 
o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = latest bootstrap.servers = [192.168.203.128:9092] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = consumer_10.13.35.30_76a6ca6f-b188-440c-88e5-aec400ec5ae7 client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = delay_consumer_group group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 200 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [org.apache.kafka.clients.consumer.CooperativeStickyAssignor] receive.buffer.bytes = -1 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 45000 socket.connection.setup.timeout.max.ms = 30000 
socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.2 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer 2025-10-14 11:41:14.310 INFO 928 --- [llo-topic_delay] o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = latest bootstrap.servers = [192.168.203.128:9092] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = consumer_10.13.35.30_05508b8f-0b32-45be-b949-9d790b839ad2 client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = delay group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 200 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [org.apache.kafka.clients.consumer.CooperativeStickyAssignor] receive.buffer.bytes = -1 reconnect.backoff.max.ms = 1000 
reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 45000 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.2 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer 2025-10-14 11:41:14.326 INFO 928 --- [ main] o.a.k.clients.producer.ProducerConfig : ProducerConfig values: acks = -1 batch.size = 4096 bootstrap.servers = [192.168.203.128:9092] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = producer-1 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] internal.auto.downgrade.txn.commit = false key.serializer = class org.apache.kafka.common.serialization.StringSerializer 
linger.ms = 1 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.2 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer 2025-10-14 11:41:14.345 INFO 928 --- [ main] 
o.a.kafka.common.utils.AppInfoParser : Kafka version: 2.8.0 2025-10-14 11:41:14.345 INFO 928 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: ebb1d6e21cc92130 2025-10-14 11:41:14.345 INFO 928 --- [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1760413274344 2025-10-14 11:41:14.355 INFO 928 --- [_consumer_group] o.a.kafka.common.utils.AppInfoParser : Kafka version: 2.8.0 2025-10-14 11:41:14.355 INFO 928 --- [_consumer_group] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: ebb1d6e21cc92130 2025-10-14 11:41:14.355 INFO 928 --- [_consumer_group] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1760413274355 2025-10-14 11:41:14.356 INFO 928 --- [_consumer_group] c.t.s.e.p.k.c.GenericKafkaConsumerTask : start to consumer kafka topic: delay_topic_level_2 2025-10-14 11:41:14.356 INFO 928 --- [llo-topic_delay] o.a.kafka.common.utils.AppInfoParser : Kafka version: 2.8.0 2025-10-14 11:41:14.356 INFO 928 --- [_consumer_group] c.t.s.e.p.k.c.AbstractTaskService : KafkaConsumerTask is running! topic:delay_topic_level_2 2025-10-14 11:41:14.356 INFO 928 --- [llo-topic_delay] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: ebb1d6e21cc92130 2025-10-14 11:41:14.356 INFO 928 --- [llo-topic_delay] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1760413274355 2025-10-14 11:41:14.356 INFO 928 --- [llo-topic_delay] c.t.s.e.p.k.consumer.KafkaConsumerTask : start to consumer kafka topic: vms_dlq_hello-topic 2025-10-14 11:41:14.356 INFO 928 --- [llo-topic_delay] c.t.s.e.p.k.c.AbstractTaskService : KafkaConsumerTask is running! 
topic:vms_dlq_hello-topic 2025-10-14 11:41:14.356 INFO 928 --- [_consumer_group] o.a.k.clients.consumer.KafkaConsumer : [Consumer clientId=consumer_10.13.35.30_76a6ca6f-b188-440c-88e5-aec400ec5ae7, groupId=delay_consumer_group] Subscribed to topic(s): delay_topic_level_2 2025-10-14 11:41:14.356 INFO 928 --- [llo-topic_delay] o.a.k.clients.consumer.KafkaConsumer : [Consumer clientId=consumer_10.13.35.30_05508b8f-0b32-45be-b949-9d790b839ad2, groupId=delay] Subscribed to topic(s): vms_dlq_hello-topic 2025-10-14 11:41:14.510 INFO 928 --- [_consumer_group] org.apache.kafka.clients.Metadata : [Consumer clientId=consumer_10.13.35.30_76a6ca6f-b188-440c-88e5-aec400ec5ae7, groupId=delay_consumer_group] Cluster ID: Cp8MopI8TC6QJ8pHpIkP9A 2025-10-14 11:41:14.510 INFO 928 --- [llo-topic_delay] org.apache.kafka.clients.Metadata : [Consumer clientId=consumer_10.13.35.30_05508b8f-0b32-45be-b949-9d790b839ad2, groupId=delay] Cluster ID: Cp8MopI8TC6QJ8pHpIkP9A 2025-10-14 11:41:14.510 INFO 928 --- [ad | producer-1] org.apache.kafka.clients.Metadata : [Producer clientId=producer-1] Cluster ID: Cp8MopI8TC6QJ8pHpIkP9A 2025-10-14 11:41:14.511 INFO 928 --- [llo-topic_delay] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer_10.13.35.30_05508b8f-0b32-45be-b949-9d790b839ad2, groupId=delay] Discovered group coordinator admin1-virtual-machine:9092 (id: 2147483647 rack: null) 2025-10-14 11:41:14.511 INFO 928 --- [_consumer_group] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer_10.13.35.30_76a6ca6f-b188-440c-88e5-aec400ec5ae7, groupId=delay_consumer_group] Discovered group coordinator admin1-virtual-machine:9092 (id: 2147483647 rack: null) 2025-10-14 11:41:14.940 INFO 928 --- [_consumer_group] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer_10.13.35.30_76a6ca6f-b188-440c-88e5-aec400ec5ae7, groupId=delay_consumer_group] (Re-)joining group 2025-10-14 11:41:14.940 INFO 928 --- [llo-topic_delay] o.a.k.c.c.internals.AbstractCoordinator : 
[Consumer clientId=consumer_10.13.35.30_05508b8f-0b32-45be-b949-9d790b839ad2, groupId=delay] (Re-)joining group 2025-10-14 11:41:14.951 INFO 928 --- [_consumer_group] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer_10.13.35.30_76a6ca6f-b188-440c-88e5-aec400ec5ae7, groupId=delay_consumer_group] (Re-)joining group 2025-10-14 11:41:14.952 INFO 928 --- [llo-topic_delay] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer_10.13.35.30_05508b8f-0b32-45be-b949-9d790b839ad2, groupId=delay] (Re-)joining group 2025-10-14 11:41:14.953 INFO 928 --- [_consumer_group] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer_10.13.35.30_76a6ca6f-b188-440c-88e5-aec400ec5ae7, groupId=delay_consumer_group] Successfully joined group with generation Generation{generationId=27, memberId='consumer_10.13.35.30_76a6ca6f-b188-440c-88e5-aec400ec5ae7-13b3b8a8-053c-487e-9876-e6dc99701e48', protocol='cooperative-sticky'} 2025-10-14 11:41:14.953 INFO 928 --- [llo-topic_delay] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer_10.13.35.30_05508b8f-0b32-45be-b949-9d790b839ad2, groupId=delay] Successfully joined group with generation Generation{generationId=101, memberId='consumer_10.13.35.30_05508b8f-0b32-45be-b949-9d790b839ad2-a48b0811-3a89-47d4-aba1-081f0fe8f61f', protocol='cooperative-sticky'} 2025-10-14 11:41:14.954 INFO 928 --- [_consumer_group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_76a6ca6f-b188-440c-88e5-aec400ec5ae7, groupId=delay_consumer_group] Finished assignment for group at generation 27: {consumer_10.13.35.30_76a6ca6f-b188-440c-88e5-aec400ec5ae7-13b3b8a8-053c-487e-9876-e6dc99701e48=Assignment(partitions=[delay_topic_level_2-0])} 2025-10-14 11:41:14.954 INFO 928 --- [llo-topic_delay] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_05508b8f-0b32-45be-b949-9d790b839ad2, groupId=delay] Finished assignment for group at generation 101: 
{consumer_10.13.35.30_05508b8f-0b32-45be-b949-9d790b839ad2-a48b0811-3a89-47d4-aba1-081f0fe8f61f=Assignment(partitions=[vms_dlq_hello-topic-0])} 2025-10-14 11:41:14.956 INFO 928 --- [llo-topic_delay] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer_10.13.35.30_05508b8f-0b32-45be-b949-9d790b839ad2, groupId=delay] Successfully synced group in generation Generation{generationId=101, memberId='consumer_10.13.35.30_05508b8f-0b32-45be-b949-9d790b839ad2-a48b0811-3a89-47d4-aba1-081f0fe8f61f', protocol='cooperative-sticky'} 2025-10-14 11:41:14.956 INFO 928 --- [_consumer_group] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer_10.13.35.30_76a6ca6f-b188-440c-88e5-aec400ec5ae7, groupId=delay_consumer_group] Successfully synced group in generation Generation{generationId=27, memberId='consumer_10.13.35.30_76a6ca6f-b188-440c-88e5-aec400ec5ae7-13b3b8a8-053c-487e-9876-e6dc99701e48', protocol='cooperative-sticky'} 2025-10-14 11:41:14.957 INFO 928 --- [llo-topic_delay] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_05508b8f-0b32-45be-b949-9d790b839ad2, groupId=delay] Updating assignment with Assigned partitions: [vms_dlq_hello-topic-0] Current owned partitions: [] Added partitions (assigned - owned): [vms_dlq_hello-topic-0] Revoked partitions (owned - assigned): [] 2025-10-14 11:41:14.957 INFO 928 --- [llo-topic_delay] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_05508b8f-0b32-45be-b949-9d790b839ad2, groupId=delay] Notifying assignor about the new Assignment(partitions=[vms_dlq_hello-topic-0]) 2025-10-14 11:41:14.957 INFO 928 --- [_consumer_group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_76a6ca6f-b188-440c-88e5-aec400ec5ae7, groupId=delay_consumer_group] Updating assignment with Assigned partitions: [delay_topic_level_2-0] Current owned partitions: [] Added partitions (assigned - owned): [delay_topic_level_2-0] Revoked partitions 
(owned - assigned): [] 2025-10-14 11:41:14.957 INFO 928 --- [_consumer_group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_76a6ca6f-b188-440c-88e5-aec400ec5ae7, groupId=delay_consumer_group] Notifying assignor about the new Assignment(partitions=[delay_topic_level_2-0]) 2025-10-14 11:41:14.958 INFO 928 --- [llo-topic_delay] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_05508b8f-0b32-45be-b949-9d790b839ad2, groupId=delay] Adding newly assigned partitions: vms_dlq_hello-topic-0 2025-10-14 11:41:14.958 INFO 928 --- [_consumer_group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_76a6ca6f-b188-440c-88e5-aec400ec5ae7, groupId=delay_consumer_group] Adding newly assigned partitions: delay_topic_level_2-0 2025-10-14 11:41:14.958 INFO 928 --- [llo-topic_delay] com.tplink.smb.eventcenter.api.Handler : ending rebalance! 2025-10-14 11:41:14.958 INFO 928 --- [_consumer_group] com.tplink.smb.eventcenter.api.Handler : ending rebalance! 
2025-10-14 11:41:14.963 INFO 928 --- [llo-topic_delay] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_05508b8f-0b32-45be-b949-9d790b839ad2, groupId=delay] Setting offset for partition vms_dlq_hello-topic-0 to the committed offset FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[admin1-virtual-machine:9092 (id: 0 rack: null)], epoch=absent}}
2025-10-14 11:41:14.963 INFO 928 --- [_consumer_group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_76a6ca6f-b188-440c-88e5-aec400ec5ae7, groupId=delay_consumer_group] Setting offset for partition delay_topic_level_2-0 to the committed offset FetchPosition{offset=357, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[admin1-virtual-machine:9092 (id: 0 rack: null)], epoch=absent}}

```java
@Setter
@Getter
@AllArgsConstructor
public class DelayEvent {
    private final String topic;
    private final String key;
    private final Integer partition;
    private final Event event;
    @JsonIgnore
    private final EventFuture eventFuture;
    private final long expirationTime;
}
```

```java
// excerpt from KafkaEventCenter
public void registerDelayConsumer(String delayTopic, ExecutorService executorService) {
    String delayGroupId = "delay_consumer_group";
    // When building the delay handler, inject a dedicated scheduledExecutorService
    DelayHandler delayHandler = new DelayHandler(this, new ScheduledThreadPoolExecutor(1));
    // Reuse the original registration logic to register a consumer for the delay topic
    registerUnicastGenerically(
            delayTopic,       // the delay topic to listen on (wildcard or concrete prefix)
            delayGroupId,     // fixed group ID for the delay consumer
            delayHandler,     // use the delay handler
            executorService,
            DelayEvent.class);
}
```

```java
@Slf4j
@RequiredArgsConstructor
public class DelayHandler implements GenericEventHandler<DelayEvent> {

    private final KafkaEventCenter eventCenter;
    private final ScheduledExecutorService delayScheduler;
    private final Map<String, Boolean> pausedTopics = new ConcurrentHashMap<>();

    @Override
    public void handleEvent(DelayEvent delayEvent) {
        long currentTime = System.currentTimeMillis();
        long expirationTime = delayEvent.getExpirationTime();
        String targetTopic = delayEvent.getTopic();
        // Already expired: forward immediately
        if (expirationTime <= currentTime) {
            eventCenter.forwardToTargetTopic(delayEvent);
            return;
        }
        // Not yet expired: pause consumption of the target topic
        pauseTargetTopic(targetTopic);
        // Compute how long to wait
        long delayMs = expirationTime - currentTime;
        // Schedule the delayed task: on expiry, resume the topic and forward the event
        delayScheduler.schedule(() -> {
            log.info("Delay task fired, targetTopic={}, now={}",
                    targetTopic, System.currentTimeMillis());
            resumeTargetTopic(targetTopic);
            eventCenter.forwardToTargetTopic(delayEvent);
        }, delayMs, TimeUnit.MILLISECONDS);
    }

    /**
     * Pause the consumer for the target topic
     */
    private void pauseTargetTopic(String targetTopic) {
        if (pausedTopics.putIfAbsent(targetTopic, true) == null) {
            try {
                eventCenter.pauseTopic(targetTopic);
                log.info("Paused delayed-topic consumption, targetTopic: {}", targetTopic);
            } catch (Exception e) {
                log.error("Failed to pause topic, targetTopic: {}", targetTopic, e);
                pausedTopics.remove(targetTopic);
            }
        }
    }

    /**
     * Resume the consumer for the target topic
     */
    private void resumeTargetTopic(String targetTopic) {
        if (pausedTopics.remove(targetTopic) != null) {
            try {
                eventCenter.resumeTopic(targetTopic);
                log.info("Resumed delayed-topic consumption, targetTopic: {}", targetTopic);
            } catch (Exception e) {
                log.error("Failed to resume topic, targetTopic: {}", targetTopic, e);
            }
        }
    }
}
```

```java
@Slf4j
@SpringBootApplication(exclude = {DataSourceAutoConfiguration.class,
        DataSourceTransactionManagerAutoConfiguration.class})
@ComponentScan(basePackages = {
        "com.tplink.nbu.demo.basicspringboot",
        "com.tplink.smb.eventcenter.port.kafka.deadletter",
        "com.tplink.smb.eventcenter.api.config"
})
public class DelayConsume implements CommandLineRunner {

    @Autowired
    private KafkaEventCenter eventCenter;
    @Autowired
    private DLQConfig deadLetterConfig;

    private static final String EVENT_TOPIC = "delay_topic_level_2";

    public static void main(String[] args) {
        SpringApplication.run(DelayConsume.class, args);
    }

    @Override
    public void run(String... args) throws Exception {
        registerDelayConsumer();
        registerEventConsumer();
        eventCenter.sendDelay("vms_dlq_hello-topic", "key1", 0,
                new Event("key1", "delay-triggered message"), 2000,
                new ForceMatchStrategy(), new EventFuture() {
                    @Override
                    public void onSuccess(EventCenterSendResult eventCenterSendResult) {
                    }

                    @Override
                    public void onFailure(Throwable throwable) {
                    }
                });
    }

    private void registerDelayConsumer() {
        eventCenter.registerDelayConsumer(EVENT_TOPIC, Executors.newSingleThreadExecutor());
    }

    private void registerEventConsumer() {
        EventHandler eventHandler = event -> {
            // Log the message and a consumption timestamp to make the consume time explicit
            log.info("[consumed at {}] message: {}",
                    LocalDateTime.now().format(DateTimeFormatter.ISO_LOCAL_DATE_TIME),
                    event.getMessage());
        };
        eventCenter.registerUnicast(
                "vms_dlq_hello-topic",
                "delay",
                eventHandler,
                Executors.newSingleThreadExecutor());
    }
}
```

Please continue the analysis and fixes: why is the message forwarding not being triggered?
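To reason about why the forward may never fire, the timing decision inside DelayHandler.handleEvent can be isolated into a pure function and exercised standalone with only the JDK. This sketch uses illustrative names (remainingDelayMs, a latch standing in for forwardToTargetTopic) and is not the event-center code itself; note that if expirationTime is computed once at registration time rather than per message, or if the single scheduler thread dies, the scheduled forward path would never run:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Standalone sketch of DelayHandler's timing decision: forward immediately when
// the event is already expired, otherwise schedule the forward after the delay.
public class DelayDecisionSketch {

    // Pure decision: how many ms to wait before forwarding (0 = forward now).
    static long remainingDelayMs(long expirationTime, long now) {
        return Math.max(0, expirationTime - now);
    }

    public static void main(String[] args) throws Exception {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        CountDownLatch forwarded = new CountDownLatch(1);

        long now = System.currentTimeMillis();
        long expirationTime = now + 50; // event expires 50 ms from now

        // Stand-in for eventCenter.forwardToTargetTopic(delayEvent)
        scheduler.schedule(forwarded::countDown,
                remainingDelayMs(expirationTime, now), TimeUnit.MILLISECONDS);

        // The forward fires only if the scheduler thread stays alive until expiry.
        boolean fired = forwarded.await(1, TimeUnit.SECONDS);
        System.out.println("forward fired: " + fired);
        scheduler.shutdown();
    }
}
```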
My SpringSecurityConfig currently looks like this. Please adapt it according to Option 1 above: if enableLocalLogin is false, deny local login outright:

/*
 * Copyright (c) 2025, TP-Link. All rights reserved.
 */
package com.tplink.smb.common.data.management.system.modules.security.config;

import com.tplink.smb.common.data.management.system.modules.security.security.JwtAccessDeniedHandler;
import com.tplink.smb.common.data.management.system.modules.security.security.JwtAuthenticationEntryPoint;
import com.tplink.smb.common.data.management.system.modules.security.security.TokenConfigurer;
import com.tplink.smb.common.data.management.system.modules.security.security.TokenProvider;
import com.tplink.smb.common.data.management.system.modules.security.service.OnlineUserService;
import com.tplink.smb.common.data.management.system.utils.AnonTagUtils;
import com.tplink.smb.common.data.management.system.utils.enums.RequestMethodEnum;
import lombok.RequiredArgsConstructor;
import com.tplink.smb.common.data.management.system.modules.security.security.*;
import org.springframework.context.ApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.http.HttpMethod;
import org.springframework.security.config.annotation.method.configuration.EnableGlobalMethodSecurity;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.config.core.GrantedAuthorityDefaults;
import org.springframework.security.config.http.SessionCreationPolicy;
import org.springframework.security.crypto.bcrypt.BCryptPasswordEncoder;
import org.springframework.security.crypto.password.PasswordEncoder;
import org.springframework.security.web.SecurityFilterChain;
import org.springframework.web.filter.CorsFilter;
import java.util.*;

/**
 * @author Chen Jiayuan
 * @version 1.0
 * @since 2025/9/30
 */
@Configuration
@RequiredArgsConstructor
@EnableGlobalMethodSecurity(prePostEnabled = true, securedEnabled = true)
public class SpringSecurityConfig {

    private final TokenProvider tokenProvider;
    private final CorsFilter corsFilter;
    private final JwtAuthenticationEntryPoint authenticationErrorHandler;
    private final JwtAccessDeniedHandler jwtAccessDeniedHandler;
    private final ApplicationContext applicationContext;
    private final SecurityProperties properties;
    private final OnlineUserService onlineUserService;

    @Bean
    GrantedAuthorityDefaults grantedAuthorityDefaults() {
        // strip the ROLE_ prefix
        return new GrantedAuthorityDefaults("");
    }

    @Bean
    public PasswordEncoder passwordEncoder() {
        // password hashing scheme
        return new BCryptPasswordEncoder();
    }

    @Bean
    protected SecurityFilterChain filterChain(HttpSecurity httpSecurity) throws Exception {
        // collect the anonymous-access markers
        Map<String, Set<String>> anonymousUrls = AnonTagUtils.getAnonymousUrl(applicationContext);
        return httpSecurity
                // disable CSRF
                .csrf().disable()
                .addFilter(corsFilter)
                // authorization error handling
                .exceptionHandling()
                .authenticationEntryPoint(authenticationErrorHandler)
                .accessDeniedHandler(jwtAccessDeniedHandler)
                // avoid iframe cross-origin blocking
                .and().headers().frameOptions().disable()
                // no sessions
                .and().sessionManagement().sessionCreationPolicy(SessionCreationPolicy.STATELESS)
                .and().authorizeRequests()
                // third-party login endpoints
                .antMatchers(HttpMethod.GET, "/auth/github_login", "/auth/azure_login").permitAll()
                // static resources
                .antMatchers(HttpMethod.GET, "/*.html", "/**/*.html", "/**/*.css", "/**/*.js", "/webSocket/**").permitAll()
                // actuator
                .antMatchers("/actuator/**").permitAll()
                // swagger docs
                .antMatchers("/swagger-ui.html").permitAll()
                .antMatchers("/swagger-resources/**").permitAll()
                .antMatchers("/webjars/**").permitAll()
                .antMatchers("/*/api-docs").permitAll()
                // files
                .antMatchers("/avatar/**").permitAll()
                .antMatchers("/file/**").permitAll()
                // Alibaba Druid console
                .antMatchers("/druid/**").permitAll()
                // allow OPTIONS requests
                .antMatchers(HttpMethod.OPTIONS, "/**").permitAll()
                // custom anonymous URLs: open to both anonymous and token access, resolved per request method
                .antMatchers(HttpMethod.GET, anonymousUrls.get(RequestMethodEnum.GET.getType()).toArray(new String[0])).permitAll()
                .antMatchers(HttpMethod.POST, anonymousUrls.get(RequestMethodEnum.POST.getType()).toArray(new String[0])).permitAll()
                .antMatchers(HttpMethod.PUT, anonymousUrls.get(RequestMethodEnum.PUT.getType()).toArray(new String[0])).permitAll()
                .antMatchers(HttpMethod.PATCH, anonymousUrls.get(RequestMethodEnum.PATCH.getType()).toArray(new String[0])).permitAll()
                .antMatchers(HttpMethod.DELETE, anonymousUrls.get(RequestMethodEnum.DELETE.getType()).toArray(new String[0])).permitAll()
                // URLs marked anonymous for all methods
                .antMatchers(anonymousUrls.get(RequestMethodEnum.ALL.getType()).toArray(new String[0])).permitAll()
                // everything else requires authentication
                .anyRequest().authenticated()
                .and().apply(securityConfigurerAdapter())
                .and().build();
    }

    private TokenConfigurer securityConfigurerAdapter() {
        return new TokenConfigurer(tokenProvider, properties, onlineUserService);
    }
}
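A minimal, self-contained model of the requested gate (not the project's code): when the flag is off, the local-login endpoint is denied before any other rule applies. The flag name (enableLocalLogin) and the /auth/login path are assumptions taken from the question; in the real filterChain the same decision would be a conditional matcher registered before .anyRequest().authenticated():

```java
// Hypothetical model of the intended rule. In the actual SpringSecurityConfig
// this corresponds to something like (accessor name assumed):
//   if (!properties.isEnableLocalLogin()) {
//       registry.antMatchers("/auth/login").denyAll();
//   }
// placed before .anyRequest().authenticated().
public class LocalLoginGate {

    private final boolean enableLocalLogin;

    public LocalLoginGate(boolean enableLocalLogin) {
        this.enableLocalLogin = enableLocalLogin;
    }

    // Local login is denied outright when the flag is false;
    // every other path falls through to normal authentication.
    public String decide(String path) {
        if ("/auth/login".equals(path) && !enableLocalLogin) {
            return "DENY";
        }
        return "AUTHENTICATE";
    }

    public static void main(String[] args) {
        LocalLoginGate gate = new LocalLoginGate(false);
        System.out.println(gate.decide("/auth/login"));   // DENY
        System.out.println(gate.decide("/api/devices"));  // AUTHENTICATE
    }
}
```

Placing the denyAll matcher ahead of the permitAll/authenticated rules matters: Spring Security evaluates matchers in registration order, and the first match wins.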
2025-09-25 10:12:59.478 INFO 31516 — [ main] c.t.n.demo.basicspringboot.KafkaDemoApp : Starting KafkaDemoApp using Java 1.8.0_462-462 on 18088363-BG with PID 31516 (D:\r\idmdemo\target\classes started by admin in D:\r\idmdemo) 2025-09-25 10:12:59.480 INFO 31516 — [ main] c.t.n.demo.basicspringboot.KafkaDemoApp : No active profile set, falling back to 1 default profile: “default” 2025-09-25 10:13:00.103 INFO 31516 — [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat initialized with port(s): 8080 (http) 2025-09-25 10:13:00.110 INFO 31516 — [ main] o.apache.catalina.core.StandardService : Starting service [Tomcat] 2025-09-25 10:13:00.110 INFO 31516 — [ main] org.apache.catalina.core.StandardEngine : Starting Servlet engine: [Apache Tomcat/9.0.71] 2025-09-25 10:13:00.170 INFO 31516 — [ main] o.a.c.c.C.[Tomcat].[localhost].[/] : Initializing Spring embedded WebApplicationContext 2025-09-25 10:13:00.170 INFO 31516 — [ main] w.s.c.ServletWebServerApplicationContext : Root WebApplicationContext: initialization completed in 668 ms 2025-09-25 10:13:00.594 INFO 31516 — [ main] o.s.b.a.e.web.EndpointLinksResolver : Exposing 1 endpoint(s) beneath base path ‘/actuator’ 2025-09-25 10:13:00.625 INFO 31516 — [ main] o.s.b.w.embedded.tomcat.TomcatWebServer : Tomcat started on port(s): 8080 (http) with context path ‘’ 2025-09-25 10:13:00.633 INFO 31516 — [ main] c.t.n.demo.basicspringboot.KafkaDemoApp : Started KafkaDemoApp in 1.371 seconds (JVM running for 1.656) 2025-09-25 10:13:00.648 INFO 31516 — [ main] c.t.s.e.port.kafka.KafkaEventCenter : start to register topic: hello-topic, groupId: demo-group 2025-09-25 10:13:00.654 INFO 31516 — [ main] c.t.s.e.port.kafka.KafkaEventCenter : start to register topic: vms_dlq_hello-topic, groupId: dlq-demo-group 2025-09-25 10:13:00.665 INFO 31516 — [_dlq-demo-group] o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = latest bootstrap.servers = 
[192.168.203.128:9092] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = consumer_10.13.35.30_c7bcb7ca-7123-40ce-99e3-8228f4657299 client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = dlq-demo-group group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 200 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [org.apache.kafka.clients.consumer.CooperativeStickyAssignor] receive.buffer.bytes = -1 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 45000 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null 
ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.2 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer 2025-09-25 10:13:00.665 INFO 31516 — [opic_demo-group] o.a.k.clients.consumer.ConsumerConfig : ConsumerConfig values: allow.auto.create.topics = true auto.commit.interval.ms = 5000 auto.offset.reset = latest bootstrap.servers = [192.168.203.128:9092] check.crcs = true client.dns.lookup = use_all_dns_ips client.id = consumer_10.13.35.30_84858b4f-b0ce-4324-90e8-3ceb81cfcf65 client.rack = connections.max.idle.ms = 540000 default.api.timeout.ms = 60000 enable.auto.commit = false exclude.internal.topics = true fetch.max.bytes = 52428800 fetch.max.wait.ms = 500 fetch.min.bytes = 1 group.id = demo-group group.instance.id = null heartbeat.interval.ms = 3000 interceptor.classes = [] internal.leave.group.on.close = true internal.throw.on.fetch.stable.offset.unsupported = false isolation.level = read_uncommitted key.deserializer = class org.apache.kafka.common.serialization.StringDeserializer max.partition.fetch.bytes = 1048576 max.poll.interval.ms = 300000 max.poll.records = 200 metadata.max.age.ms = 300000 metric.reporters = [] metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partition.assignment.strategy = [org.apache.kafka.clients.consumer.CooperativeStickyAssignor] receive.buffer.bytes = -1 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit 
sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 session.timeout.ms = 45000 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.2 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer 2025-09-25 10:13:00.678 INFO 31516 — [ main] o.a.k.clients.producer.ProducerConfig : ProducerConfig values: acks = -1 batch.size = 4096 bootstrap.servers = [192.168.203.128:9092] buffer.memory = 33554432 client.dns.lookup = use_all_dns_ips client.id = producer-1 compression.type = none connections.max.idle.ms = 540000 delivery.timeout.ms = 120000 enable.idempotence = false interceptor.classes = [] internal.auto.downgrade.txn.commit = false key.serializer = class org.apache.kafka.common.serialization.StringSerializer linger.ms = 1 max.block.ms = 60000 max.in.flight.requests.per.connection = 5 max.request.size = 1048576 metadata.max.age.ms = 300000 metadata.max.idle.ms = 300000 metric.reporters = [] 
metrics.num.samples = 2 metrics.recording.level = INFO metrics.sample.window.ms = 30000 partitioner.class = class org.apache.kafka.clients.producer.internals.DefaultPartitioner receive.buffer.bytes = 32768 reconnect.backoff.max.ms = 1000 reconnect.backoff.ms = 50 request.timeout.ms = 30000 retries = 2147483647 retry.backoff.ms = 100 sasl.client.callback.handler.class = null sasl.jaas.config = null sasl.kerberos.kinit.cmd = /usr/bin/kinit sasl.kerberos.min.time.before.relogin = 60000 sasl.kerberos.service.name = null sasl.kerberos.ticket.renew.jitter = 0.05 sasl.kerberos.ticket.renew.window.factor = 0.8 sasl.login.callback.handler.class = null sasl.login.class = null sasl.login.refresh.buffer.seconds = 300 sasl.login.refresh.min.period.seconds = 60 sasl.login.refresh.window.factor = 0.8 sasl.login.refresh.window.jitter = 0.05 sasl.mechanism = GSSAPI security.protocol = PLAINTEXT security.providers = null send.buffer.bytes = 131072 socket.connection.setup.timeout.max.ms = 30000 socket.connection.setup.timeout.ms = 10000 ssl.cipher.suites = null ssl.enabled.protocols = [TLSv1.2] ssl.endpoint.identification.algorithm = https ssl.engine.factory.class = null ssl.key.password = null ssl.keymanager.algorithm = SunX509 ssl.keystore.certificate.chain = null ssl.keystore.key = null ssl.keystore.location = null ssl.keystore.password = null ssl.keystore.type = JKS ssl.protocol = TLSv1.2 ssl.provider = null ssl.secure.random.implementation = null ssl.trustmanager.algorithm = PKIX ssl.truststore.certificates = null ssl.truststore.location = null ssl.truststore.password = null ssl.truststore.type = JKS transaction.timeout.ms = 60000 transactional.id = null value.serializer = class org.apache.kafka.common.serialization.ByteArraySerializer 2025-09-25 10:13:00.700 INFO 31516 — [ main] o.a.kafka.common.utils.AppInfoParser : Kafka version: 2.8.0 2025-09-25 10:13:00.700 INFO 31516 — [ main] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: ebb1d6e21cc92130 2025-09-25 10:13:00.700 
INFO 31516 — [ main] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1758766380699 2025-09-25 10:13:00.711 INFO 31516 — [opic_demo-group] o.a.kafka.common.utils.AppInfoParser : Kafka version: 2.8.0 2025-09-25 10:13:00.711 INFO 31516 — [opic_demo-group] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: ebb1d6e21cc92130 2025-09-25 10:13:00.711 INFO 31516 — [opic_demo-group] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1758766380711 2025-09-25 10:13:00.711 INFO 31516 — [opic_demo-group] c.t.s.e.p.k.consumer.KafkaConsumerTask : start to consumer kafka topic: hello-topic 2025-09-25 10:13:00.711 INFO 31516 — [opic_demo-group] c.t.s.e.p.k.c.AbstractTaskService : KafkaConsumerTask is running! topic:hello-topic 2025-09-25 10:13:00.711 INFO 31516 — [_dlq-demo-group] o.a.kafka.common.utils.AppInfoParser : Kafka version: 2.8.0 2025-09-25 10:13:00.711 INFO 31516 — [_dlq-demo-group] o.a.kafka.common.utils.AppInfoParser : Kafka commitId: ebb1d6e21cc92130 2025-09-25 10:13:00.711 INFO 31516 — [_dlq-demo-group] o.a.kafka.common.utils.AppInfoParser : Kafka startTimeMs: 1758766380711 2025-09-25 10:13:00.711 INFO 31516 — [_dlq-demo-group] c.t.s.e.p.k.consumer.KafkaConsumerTask : start to consumer kafka topic: vms_dlq_hello-topic 2025-09-25 10:13:00.711 INFO 31516 — [_dlq-demo-group] c.t.s.e.p.k.c.AbstractTaskService : KafkaConsumerTask is running! 
topic:vms_dlq_hello-topic 2025-09-25 10:13:00.711 INFO 31516 — [_dlq-demo-group] o.a.k.clients.consumer.KafkaConsumer : [Consumer clientId=consumer_10.13.35.30_c7bcb7ca-7123-40ce-99e3-8228f4657299, groupId=dlq-demo-group] Subscribed to topic(s): vms_dlq_hello-topic 2025-09-25 10:13:00.711 INFO 31516 — [opic_demo-group] o.a.k.clients.consumer.KafkaConsumer : [Consumer clientId=consumer_10.13.35.30_84858b4f-b0ce-4324-90e8-3ceb81cfcf65, groupId=demo-group] Subscribed to topic(s): hello-topic 2025-09-25 10:13:00.863 INFO 31516 — [ad | producer-1] org.apache.kafka.clients.Metadata : [Producer clientId=producer-1] Cluster ID: ZzP7spDrRwuJk27muhJ29g 2025-09-25 10:13:00.863 INFO 31516 — [opic_demo-group] org.apache.kafka.clients.Metadata : [Consumer clientId=consumer_10.13.35.30_84858b4f-b0ce-4324-90e8-3ceb81cfcf65, groupId=demo-group] Cluster ID: ZzP7spDrRwuJk27muhJ29g 2025-09-25 10:13:00.863 INFO 31516 — [_dlq-demo-group] org.apache.kafka.clients.Metadata : [Consumer clientId=consumer_10.13.35.30_c7bcb7ca-7123-40ce-99e3-8228f4657299, groupId=dlq-demo-group] Cluster ID: ZzP7spDrRwuJk27muhJ29g 2025-09-25 10:13:00.863 INFO 31516 — [_dlq-demo-group] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer_10.13.35.30_c7bcb7ca-7123-40ce-99e3-8228f4657299, groupId=dlq-demo-group] Discovered group coordinator admin1-virtual-machine:9092 (id: 2147483647 rack: null) 2025-09-25 10:13:00.863 INFO 31516 — [opic_demo-group] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer_10.13.35.30_84858b4f-b0ce-4324-90e8-3ceb81cfcf65, groupId=demo-group] Discovered group coordinator admin1-virtual-machine:9092 (id: 2147483647 rack: null) 2025-09-25 10:13:01.300 INFO 31516 — [_dlq-demo-group] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer_10.13.35.30_c7bcb7ca-7123-40ce-99e3-8228f4657299, groupId=dlq-demo-group] (Re-)joining group 2025-09-25 10:13:01.300 INFO 31516 — [opic_demo-group] o.a.k.c.c.internals.AbstractCoordinator : [Consumer 
clientId=consumer_10.13.35.30_84858b4f-b0ce-4324-90e8-3ceb81cfcf65, groupId=demo-group] (Re-)joining group 2025-09-25 10:13:01.320 INFO 31516 — [opic_demo-group] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer_10.13.35.30_84858b4f-b0ce-4324-90e8-3ceb81cfcf65, groupId=demo-group] (Re-)joining group 2025-09-25 10:13:01.320 INFO 31516 — [_dlq-demo-group] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer_10.13.35.30_c7bcb7ca-7123-40ce-99e3-8228f4657299, groupId=dlq-demo-group] (Re-)joining group 2025-09-25 10:13:01.323 INFO 31516 — [_dlq-demo-group] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer_10.13.35.30_c7bcb7ca-7123-40ce-99e3-8228f4657299, groupId=dlq-demo-group] Successfully joined group with generation Generation{generationId=30, memberId=‘consumer_10.13.35.30_c7bcb7ca-7123-40ce-99e3-8228f4657299-67bf5860-c183-4894-acbb-8837abf647f6’, protocol=‘cooperative-sticky’} 2025-09-25 10:13:01.323 INFO 31516 — [opic_demo-group] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer_10.13.35.30_84858b4f-b0ce-4324-90e8-3ceb81cfcf65, groupId=demo-group] Successfully joined group with generation Generation{generationId=31, memberId=‘consumer_10.13.35.30_84858b4f-b0ce-4324-90e8-3ceb81cfcf65-c3379cf9-80c6-40cc-a872-71b04e2f7b27’, protocol=‘cooperative-sticky’} 2025-09-25 10:13:01.324 INFO 31516 — [_dlq-demo-group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_c7bcb7ca-7123-40ce-99e3-8228f4657299, groupId=dlq-demo-group] Finished assignment for group at generation 30: {consumer_10.13.35.30_c7bcb7ca-7123-40ce-99e3-8228f4657299-67bf5860-c183-4894-acbb-8837abf647f6=Assignment(partitions=[vms_dlq_hello-topic-0])} 2025-09-25 10:13:01.324 INFO 31516 — [opic_demo-group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_84858b4f-b0ce-4324-90e8-3ceb81cfcf65, groupId=demo-group] Finished assignment for group at generation 31: 
{consumer_10.13.35.30_84858b4f-b0ce-4324-90e8-3ceb81cfcf65-c3379cf9-80c6-40cc-a872-71b04e2f7b27=Assignment(partitions=[hello-topic-0])} 2025-09-25 10:13:01.327 INFO 31516 — [opic_demo-group] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer_10.13.35.30_84858b4f-b0ce-4324-90e8-3ceb81cfcf65, groupId=demo-group] Successfully synced group in generation Generation{generationId=31, memberId=‘consumer_10.13.35.30_84858b4f-b0ce-4324-90e8-3ceb81cfcf65-c3379cf9-80c6-40cc-a872-71b04e2f7b27’, protocol=‘cooperative-sticky’} 2025-09-25 10:13:01.327 INFO 31516 — [opic_demo-group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_84858b4f-b0ce-4324-90e8-3ceb81cfcf65, groupId=demo-group] Updating assignment with Assigned partitions: [hello-topic-0] Current owned partitions: [] Added partitions (assigned - owned): [hello-topic-0] Revoked partitions (owned - assigned): [] 2025-09-25 10:13:01.327 INFO 31516 — [opic_demo-group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_84858b4f-b0ce-4324-90e8-3ceb81cfcf65, groupId=demo-group] Notifying assignor about the new Assignment(partitions=[hello-topic-0]) 2025-09-25 10:13:01.328 INFO 31516 — [_dlq-demo-group] o.a.k.c.c.internals.AbstractCoordinator : [Consumer clientId=consumer_10.13.35.30_c7bcb7ca-7123-40ce-99e3-8228f4657299, groupId=dlq-demo-group] Successfully synced group in generation Generation{generationId=30, memberId=‘consumer_10.13.35.30_c7bcb7ca-7123-40ce-99e3-8228f4657299-67bf5860-c183-4894-acbb-8837abf647f6’, protocol=‘cooperative-sticky’} 2025-09-25 10:13:01.328 INFO 31516 — [_dlq-demo-group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_c7bcb7ca-7123-40ce-99e3-8228f4657299, groupId=dlq-demo-group] Updating assignment with Assigned partitions: [vms_dlq_hello-topic-0] Current owned partitions: [] Added partitions (assigned - owned): [vms_dlq_hello-topic-0] Revoked partitions (owned - assigned): [] 
2025-09-25 10:13:01.328 INFO 31516 — [_dlq-demo-group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_c7bcb7ca-7123-40ce-99e3-8228f4657299, groupId=dlq-demo-group] Notifying assignor about the new Assignment(partitions=[vms_dlq_hello-topic-0]) 2025-09-25 10:13:01.328 INFO 31516 — [opic_demo-group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_84858b4f-b0ce-4324-90e8-3ceb81cfcf65, groupId=demo-group] Adding newly assigned partitions: hello-topic-0 2025-09-25 10:13:01.328 INFO 31516 — [_dlq-demo-group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_c7bcb7ca-7123-40ce-99e3-8228f4657299, groupId=dlq-demo-group] Adding newly assigned partitions: vms_dlq_hello-topic-0 2025-09-25 10:13:01.328 INFO 31516 — [opic_demo-group] com.tplink.smb.eventcenter.api.Handler : ending rebalance! 2025-09-25 10:13:01.328 INFO 31516 — [_dlq-demo-group] com.tplink.smb.eventcenter.api.Handler : ending rebalance! 2025-09-25 10:13:01.333 INFO 31516 — [opic_demo-group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_84858b4f-b0ce-4324-90e8-3ceb81cfcf65, groupId=demo-group] Setting offset for partition hello-topic-0 to the committed offset FetchPosition{offset=17, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[admin1-virtual-machine:9092 (id: 0 rack: null)], epoch=absent}} 2025-09-25 10:13:01.333 INFO 31516 — [_dlq-demo-group] o.a.k.c.c.internals.ConsumerCoordinator : [Consumer clientId=consumer_10.13.35.30_c7bcb7ca-7123-40ce-99e3-8228f4657299, groupId=dlq-demo-group] Setting offset for partition vms_dlq_hello-topic-0 to the committed offset FetchPosition{offset=0, offsetEpoch=Optional.empty, currentLeader=LeaderAndEpoch{leader=Optional[admin1-virtual-machine:9092 (id: 0 rack: null)], epoch=absent}} 尝试处理消息: Hello Kafka! 
这是一条会触发死信的消息
2025-09-25 10:13:01.371  WARN 31516 — [nPool-worker-22] c.t.s.e.p.k.d.DLQEventHandlerWrapper : Event handle failed (attempt 1/2): 模拟业务处理失败
2025-09-25 10:13:01.371  INFO 31516 — [nPool-worker-22] c.t.s.e.p.k.d.DLQEventHandlerWrapper : Scheduled retry delay: delay=1000ms
尝试处理消息: Hello Kafka! 这是一条会触发死信的消息
2025-09-25 10:13:02.379  WARN 31516 — [nPool-worker-22] c.t.s.e.p.k.d.DLQEventHandlerWrapper : Event handle failed (attempt 2/2): 模拟业务处理失败
2025-09-25 10:13:02.381  INFO 31516 — [nPool-worker-22] c.t.s.e.port.kafka.KafkaEventCenter : not implement yet
2025-09-25 10:13:02.382  INFO 31516 — [nPool-worker-22] c.t.s.e.p.k.d.DLQEventHandlerWrapper : Event sent to DLQ topic: vms_dlq_hello-topic
2025-09-25 10:13:02.382 ERROR 31516 — [nPool-worker-22] c.t.s.e.p.k.d.DLQEventHandlerWrapper : Event failed after 2 retries (sent to DLQ)

Based on this log, please analyze: if the dead-letter queue needs access to the offset and the target topic, where should we hook in?
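Judging by the log, DLQEventHandlerWrapper is the component that both sees the failing record and performs the DLQ send ("Event sent to DLQ topic"), so the original topic/partition/offset are still in scope there; that is the natural interception point, and the coordinates can be attached to the DLQ record, for example as headers. A minimal sketch of that metadata capture follows — the class and header names are assumptions, not the eventcenter internals:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch: build the metadata that a DLQ wrapper would attach to
// the dead-letter record, taken from the failed ConsumerRecord before the
// DLQ send. With kafka-clients these entries would become record headers on
// the ProducerRecord sent to the DLQ topic.
public class DlqMetadata {

    static Map<String, String> dlqHeaders(String topic, int partition, long offset) {
        Map<String, String> headers = new LinkedHashMap<>();
        headers.put("x-original-topic", topic);
        headers.put("x-original-partition", Integer.toString(partition));
        headers.put("x-original-offset", Long.toString(offset));
        return headers;
    }

    public static void main(String[] args) {
        // Values taken from the log above: hello-topic partition 0, offset 17
        System.out.println(dlqHeaders("hello-topic", 0, 17));
    }
}
```

The DLQ consumer can then read these headers to learn where the event originally came from, without parsing the payload.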