Troubleshooting a Dubbo thread-pool error: the root cause turned out to be a concurrency problem in the Redis process() call, which ended up hanging the server
The first symptom was this error:
Thread pool is EXHAUSTED! Thread Name: %s, Pool Size: 200 (active: 200, core: 200, max: 200, largest: 200
From the message it is clear that Dubbo's provider thread pool was completely full.
But why? I tuned various concurrency parameters and other settings, yet the problem remained.
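For context, these are the kind of provider-side concurrency knobs one ends up tuning at this point. The snippet below is only a sketch with assumed values, configured through Dubbo's ProtocolConfig API; the post does not list the settings that were actually tried.

import com.alibaba.dubbo.config.ProtocolConfig;

// Illustrative only: assumed values, not the settings actually used.
public class DubboProtocolTuningSketch {
    public static ProtocolConfig buildProtocol() {
        ProtocolConfig protocol = new ProtocolConfig();
        protocol.setName("dubbo");
        protocol.setThreadpool("fixed"); // fixed-size provider thread pool
        protocol.setThreads(200);        // matches "Pool Size: 200" in the error above
        return protocol;
    }
}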
My next guess was that the pool's worker threads were all blocked and never freed up to run other tasks. But why would they stay blocked? Normally a thread-pool worker goes back to idle waiting as soon as it finishes a task (under the hood each worker sits in a while loop that keeps checking whether it has a task to run and whether the work queue holds pending tasks).
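As a rough illustration of that idle-wait behaviour, here is a minimal worker-loop sketch. It is not Dubbo's or the JDK's actual code, just the pattern: the thread blocks on the queue between tasks, so it only stays occupied while a task body is running.

import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

public class WorkerLoopSketch {
    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<Runnable> workQueue = new LinkedBlockingQueue<>();
        Thread worker = new Thread(() -> {
            try {
                while (true) {
                    Runnable task = workQueue.take(); // idle-wait until a task arrives
                    task.run();                       // occupied only while the task runs
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();   // stop looping when interrupted
            }
        });
        worker.setDaemon(true);
        worker.start();
        workQueue.put(() -> System.out.println("task finished, worker goes back to waiting"));
        Thread.sleep(100); // let the daemon worker print before the JVM exits
    }
}

So if the workers never return to that waiting state, something inside the task bodies must be blocking them.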
So I took a thread dump with jstack <pid> > <dump file> to see how the dataServer service's threads were being used, and found a whole pile of threads stuck on Redis:
"laizhan-fixed-thread-pool-thread-183" #405 daemon prio=5 os_prio=0 tid=0x0000000021448800 nid=0x3f64 waiting on condition [0x000000003ea2c000]
java.lang.Thread.State: WAITING (parking)
at sun.misc.Unsafe.park(Native Method)
- parking to wait for <0x0000000082f9df40> (a java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject)
at java.util.concurrent.locks.LockSupport.park(LockSupport.java:175)
at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.await(AbstractQueuedSynchronizer.java:2039)
at org.apache.commons.pool2.impl.LinkedBlockingDeque.takeFirst(LinkedBlockingDeque.java:583)
at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:442)
at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:363)
at redis.clients.util.Pool.getResource(Pool.java:48)
at redis.clients.jedis.JedisPool.getResource(JedisPool.java:99)
at redis.clients.jedis.JedisPool.getResource(JedisPool.java:12)
at org.springframework.data.redis.connection.jedis.JedisConnectionFactory.fetchJedisConnector(JedisConnectionFactory.java:155)
at org.springframework.data.redis.connection.jedis.JedisConnectionFactory.getConnection(JedisConnectionFactory.java:251)
at org.springframework.data.redis.connection.jedis.JedisConnectionFactory$$FastClassBySpringCGLIB$$648e8c34.invoke(<generated>)
at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:204)
at org.springframework.aop.framework.CglibAopProxy$DynamicAdvisedInterceptor.intercept(CglibAopProxy.java:651)
at org.springframework.data.redis.connection.jedis.JedisConnectionFactory$$EnhancerBySpringCGLIB$$852a9acd.getConnection(<generated>)
at tv.laizhan.dataSvr.helper.RedisHelper.getJedis(RedisHelper.java:66)
at tv.laizhan.dataSvr.helper.RedisHelper.process(RedisHelper.java:76)
at tv.laizhan.dataSvr.helper.RedisHelper.process(RedisHelper.java:70)
at tv.laizhan.dataSvr.redis.ConfigRedis.findByKeyname(ConfigRedis.java:27)
at tv.laizhan.dataSvr.center.ConfigCenter.findByKeyname(ConfigCenter.java:34)
at tv.laizhan.dataSvr.conf.RedisConfig.getOneDay_exp(RedisConfig.java:108)
at tv.laizhan.dataSvr.redis.UserRedis.lambda$setExpVal$2(UserRedis.java:49)
at tv.laizhan.dataSvr.redis.UserRedis$$Lambda$55/340856506.process(Unknown Source)
at tv.laizhan.dataSvr.helper.RedisHelper.process(RedisHelper.java:78)
at tv.laizhan.dataSvr.helper.RedisHelper.process(RedisHelper.java:70)
at tv.laizhan.dataSvr.redis.UserRedis.setExpVal(UserRedis.java:47)
at tv.laizhan.dataSvr.center.UserCenter.scoreExpVal(UserCenter.java:299)
at tv.laizhan.dataSvr.affair.LoginAffair.lambda$login$0(LoginAffair.java:131)
at tv.laizhan.dataSvr.affair.LoginAffair$$Lambda$18/1993346104.process(Unknown Source)
at tv.laizhan.dataSvr.affair.AbsAffair.process(AbsAffair.java:38)
at tv.laizhan.dataSvr.affair.AbsAffair.process(AbsAffair.java:30)
at tv.laizhan.dataSvr.affair.LoginAffair.login(LoginAffair.java:87)
at com.alibaba.dubbo.common.bytecode.Wrapper29.invokeMethod(Wrapper29.java)
at com.alibaba.dubbo.rpc.proxy.javassist.JavassistProxyFactory$1.doInvoke(JavassistProxyFactory.java:46)
at com.alibaba.dubbo.rpc.proxy.AbstractProxyInvoker.invoke(AbstractProxyInvoker.java:72)
at com.alibaba.dubbo.rpc.protocol.InvokerWrapper.invoke(InvokerWrapper.java:53)
at com.alibaba.dubbo.rpc.filter.ExceptionFilter.invoke(ExceptionFilter.java:64)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.alibaba.dubbo.monitor.support.MonitorFilter.invoke(MonitorFilter.java:65)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.alibaba.dubbo.rpc.filter.TimeoutFilter.invoke(TimeoutFilter.java:42)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.alibaba.dubbo.rpc.protocol.dubbo.filter.TraceFilter.invoke(TraceFilter.java:78)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.alibaba.dubbo.rpc.filter.ContextFilter.invoke(ContextFilter.java:70)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.alibaba.dubbo.rpc.filter.GenericFilter.invoke(GenericFilter.java:132)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.alibaba.dubbo.rpc.filter.ClassLoaderFilter.invoke(ClassLoaderFilter.java:38)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.alibaba.dubbo.rpc.filter.EchoFilter.invoke(EchoFilter.java:38)
at com.alibaba.dubbo.rpc.protocol.ProtocolFilterWrapper$1.invoke(ProtocolFilterWrapper.java:91)
at com.alibaba.dubbo.rpc.protocol.dubbo.DubboProtocol$1.reply(DubboProtocol.java:113)
at com.alibaba.dubbo.remoting.exchange.support.header.HeaderExchangeHandler.handleRequest(HeaderExchangeHandler.java:84)
at com.alibaba.dubbo.remoting.exchange.support.header.HeaderExchangeHandler.received(HeaderExchangeHandler.java:170)
at com.alibaba.dubbo.remoting.transport.DecodeHandler.received(DecodeHandler.java:52)
at com.alibaba.dubbo.remoting.transport.dispatcher.ChannelEventRunnable.run(ChannelEventRunnable.java:82)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
That pointed towards Redis, so I first tuned the Redis connection-pool configuration.
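The tuning was along these lines. This is a minimal sketch with assumed values (the post does not give the real ones); the stack trace above shows threads parked inside GenericObjectPool.borrowObject, i.e. waiting for a free Jedis connection, so the relevant knobs are the pool size and how long a borrow may block.

import redis.clients.jedis.JedisPool;
import redis.clients.jedis.JedisPoolConfig;

// Illustrative only: assumed values, not the configuration actually deployed.
public class JedisPoolTuningSketch {
    public static JedisPool buildPool(String host, int port) {
        JedisPoolConfig config = new JedisPoolConfig();
        config.setMaxTotal(200);        // at least as many connections as Dubbo worker threads
        config.setMaxIdle(50);          // idle connections kept ready for bursts
        config.setMaxWaitMillis(2000);  // fail fast instead of parking forever when the pool is empty
        config.setTestOnBorrow(true);   // validate connections before handing them out
        return new JedisPool(config, host, port);
    }
}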
That still didn't fix it, so as a last resort I added synchronized to the RedisHelper method that gets a connection, executes the command, and recycles the connection:
public synchronized <T> T process(IRedisFunction<T> function, IRedisErrorFunction<T> errorFunction) {
    T reslt = null;
    Exception ee = null;
    Jedis jedis = getJedis();
    try {
        reslt = function.process(jedis);
    }
    // ... rest of the method (catch/finally) omitted in the original post ...
}
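For reference, a plausible shape of the complete method, assuming the elided part catches the exception and recycles the connection in a finally block. The returnJedis(...) helper and the errorFunction fallback are assumptions for illustration, not the original code:

public synchronized <T> T process(IRedisFunction<T> function, IRedisErrorFunction<T> errorFunction) {
    T reslt = null;
    Exception ee = null;
    Jedis jedis = getJedis();
    try {
        reslt = function.process(jedis);
    } catch (Exception e) {
        ee = e;                            // remember the failure for the error callback
    } finally {
        returnJedis(jedis);                // assumed helper: hand the connection back to the pool
    }
    if (ee != null && errorFunction != null) {
        reslt = errorFunction.process(ee); // assumed: delegate the failure to the caller's handler
    }
    return reslt;
}

Note that synchronized serializes every Redis call through this single method, so at most one thread borrows a connection from the pool at any time.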
After that I ran all kinds of concurrency stress tests again and, sure enough, everything worked with no more errors.
My analysis: when the get-connection / execute / recycle sequence ran concurrently, the threads holding Redis connections ended up deadlocked (the jstack dump above shows them parked inside GenericObjectPool.borrowObject, waiting for a connection). In a domino effect the threads behind them got stuck one after another, until Dubbo's thread pool was exhausted and rejected every incoming request, and the server hung and could no longer serve.