Druid.io Deployment & Usage Guide

This document walks through deploying the Druid.io big-data processing platform, including setting up the Java environment, ZooKeeper, MySQL, and HDFS. It also provides the startup commands for each component and notes on the key configuration options.



Date: 2016-12-23

Tags: massive data | Category: open-source Java

Original: http://blog.youkuaiyun.com/silentwolfyh/article/details/53843335

 

References:

http://lxw1234.com/archives/2015/11/554.htm - Druid.io installation, configuration, and hands-on notes for real-time OLAP analysis over massive data
http://druid.io/docs/0.9.2/design/design.html - Official Druid documentation (design and cluster setup)



1. Clustered deployment

Prerequisites: Java 7 or higher, ZooKeeper, and MySQL. A quick sanity check is sketched below.
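A minimal pre-flight check, assuming the ZooKeeper and MySQL hosts used later in this document (adjust hostnames and credentials to your own environment):

# Verify the Java version (7 or higher is required)
java -version

# Verify that ZooKeeper answers on its client port (should print "imok")
echo ruok | nc tagtic-slave01 2181

# Verify MySQL connectivity and create the metadata database configured below
mysql -h tagtic-master -uroot -p -e "CREATE DATABASE IF NOT EXISTS druid DEFAULT CHARACTER SET utf8;"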

Download Druid.io:

curl -O http://static.druid.io/artifacts/releases/druid-0.9.1.1-bin.tar.gz
tar -xzf druid-0.9.1.1-bin.tar.gz
cd druid-0.9.1.1

Directory layout:

  • LICENSE - the license files.
  • bin/ - scripts related to the single-machine quickstart.
  • conf/* - template configurations for a clustered setup.
  • conf-quickstart/* - configurations for the single-machine quickstart.
  • extensions/* - all Druid extensions.
  • hadoop-dependencies/* - Druid Hadoop dependencies.
  • lib/* - all included software packages for core Druid.
  • quickstart/* - files related to the single-machine quickstart.

All configuration files live under conf/*.

Configure HDFS as Druid.io's deep storage, and configure ZooKeeper and MySQL.

Edit conf/druid/_common/common.runtime.properties:

#
# Extensions
#

# This is not the full list of Druid extensions, but common ones that people often use. You may need to change this list
# based on your particular setup.
#使用 "mysql-metadata-storage" 作为metadata的存储
#使用 "druid-hdfs-storage" 作为 deep storage
#使用 "druid-parquet-extensions" 向druid中插入parquet数据
druid.extensions.loadList=["druid-kafka-eight", "druid-histogram", "druid-datasketches",  "mysql-metadata-storage", "druid-hdfs-storage", "druid-avro-extensions", "druid-parquet-extensions"]

# If you have a different version of Hadoop, place your Hadoop client jar files in your hadoop-dependencies directory
# and uncomment the line below to point to your directory.
#druid.extensions.hadoopDependenciesDir=/my/dir/hadoop-dependencies

#
# Logging
#

# Log all runtime properties on startup. Disable to avoid logging properties on startup:
druid.startup.logging.logProperties=true

#
# Zookeeper
#

druid.zk.service.host=tagtic-slave01:2181,tagtic-slave02:2181,tagtic-slave03:2181
druid.zk.paths.base=/druid

#
# Metadata storage
#

# For MySQL:
druid.metadata.storage.type=mysql
druid.metadata.storage.connector.connectURI=jdbc:mysql://tagtic-master:3306/druid
druid.metadata.storage.connector.user=root
druid.metadata.storage.connector.password=123456

#
# Deep storage
#

# For HDFS (make sure to include the HDFS extension and that your Hadoop config files are on the classpath):
druid.storage.type=hdfs
druid.storage.storageDirectory=/druid/segments

#
# Indexing service logs
#

# For HDFS (make sure to include the HDFS extension and that your Hadoop config files are on the classpath):
druid.indexer.logs.type=hdfs
druid.indexer.logs.directory=/druid/indexing-logs

#
# Service discovery
#

druid.selectors.indexing.serviceName=druid/overlord
druid.selectors.coordinator.serviceName=druid/coordinator

#
# Monitoring
#

druid.monitoring.monitors=["com.metamx.metrics.JvmMonitor"]
druid.emitter=logging
druid.emitter.logging.logLevel=info

Copy the Hadoop configuration files (core-site.xml, hdfs-site.xml, yarn-site.xml, mapred-site.xml) into the conf/druid/_common directory, for example:
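A minimal sketch, assuming the Hadoop client configuration lives under /etc/hadoop/conf (adjust the source path to your distribution):

cp /etc/hadoop/conf/core-site.xml \
   /etc/hadoop/conf/hdfs-site.xml \
   /etc/hadoop/conf/yarn-site.xml \
   /etc/hadoop/conf/mapred-site.xml \
   conf/druid/_common/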

Edit conf/druid/middleManager/runtime.properties:

druid.service=druid/middleManager
druid.port=18091

# Number of tasks per middleManager
druid.worker.capacity=3

# Task launch parameters
# On CDH, add -Dhadoop.mapreduce.job.classloader=true to resolve jar conflicts when running Hadoop indexing tasks
druid.indexer.runner.javaOpts=-server -Xmx2g -Duser.timezone=UTC -Dfile.encoding=UTF-8 -Djava.util.logging.manager=org.apache.logging.log4j.jul.LogManager -Dhadoop.mapreduce.job.classloader=true

druid.indexer.task.baseTaskDir=var/druid/task

# HTTP server threads
druid.server.http.numThreads=25

# Processing threads and buffers
druid.processing.buffer.sizeBytes=536870912
druid.processing.numThreads=2

# Hadoop indexing
druid.indexer.task.hadoopWorkingPath=/tmp/druid-indexing
druid.indexer.task.defaultHadoopCoordinates=["org.apache.hadoop:hadoop-client:2.6.0"]

The Hadoop client version bundled with Druid.io must match your Hadoop cluster's version. A matching hadoop-dependencies version can be downloaded with the pull-deps tool, e.g.:
java -classpath "lib/*" io.druid.cli.Main tools pull-deps --defaultVersion 0.9.1.1 -c io.druid.extensions:mysql-metadata-storage:0.9.1.1 -c druid-hdfs-storage -h org.apache.hadoop:hadoop-client:2.6.0

Ports configured for Druid.io in this project (a quick way to check them follows the list):

  • druid.service=druid/coordinator druid.port=18081
  • druid.service=druid/broker druid.port=18082
  • druid.service=druid/historical druid.port=18083
  • druid.service=druid/overlord druid.port=18090
  • druid.service=druid/middleManager druid.port=18091
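Each value is set via druid.service and druid.port in the corresponding conf/druid/<component>/runtime.properties. A small check sketch to confirm what is actually configured:

grep -H -e '^druid.service' -e '^druid.port' conf/druid/*/runtime.properties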

2. Starting Druid.io

java `cat conf/druid/coordinator/jvm.config | xargs` -cp conf/druid/_common:conf/druid/coordinator:lib/* io.druid.cli.Main server coordinator &>> logs/coordinator.log &

java `cat conf/druid/overlord/jvm.config | xargs` -cp conf/druid/_common:conf/druid/overlord:lib/* io.druid.cli.Main server overlord &>> logs/overlord.log &

java `cat conf/druid/historical/jvm.config | xargs` -cp conf/druid/_common:conf/druid/historical:lib/* io.druid.cli.Main server historical &>> logs/historical.log &

java `cat conf/druid/middleManager/jvm.config | xargs` -cp conf/druid/_common:conf/druid/middleManager:lib/* io.druid.cli.Main server middleManager &>> logs/middleManager.log &

java `cat conf/druid/broker/jvm.config | xargs` -cp conf/druid/_common:conf/druid/broker:lib/* io.druid.cli.Main server broker &>> logs/broker.log &
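Once the five processes are up, each Druid service exposes a /status endpoint that can serve as a quick health check (a sketch; replace the hostnames with the machines each service actually runs on):

curl http://tagtic-master:18081/status    # coordinator
curl http://tagtic-master:18090/status    # overlord
curl http://tagtic-master:18083/status    # historical
curl http://tagtic-master:18091/status    # middleManager
curl http://tagtic-master:18082/status    # broker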

3. Ingesting data from HDFS into Druid.io

Bulk loading is done with Batch Data Ingestion.
Parquet files can be ingested via the druid-parquet-extensions loaded above (a task-submission sketch follows).
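A hedged sketch of submitting a Hadoop batch-ingestion task for a Parquet file to the Overlord. The datasource name, dimensions, interval, and HDFS path are placeholder examples, and the parquet parser/inputFormat settings follow the 0.9.x druid-parquet-extensions documentation; adapt the spec to your own schema:

cat > parquet-index-task.json <<'EOF'
{
  "type": "index_hadoop",
  "spec": {
    "dataSchema": {
      "dataSource": "example_datasource",
      "parser": {
        "type": "parquet",
        "parseSpec": {
          "format": "timeAndDims",
          "timestampSpec": { "column": "timestamp", "format": "auto" },
          "dimensionsSpec": { "dimensions": ["dim1", "dim2"] }
        }
      },
      "metricsSpec": [ { "type": "count", "name": "count" } ],
      "granularitySpec": {
        "type": "uniform",
        "segmentGranularity": "DAY",
        "queryGranularity": "NONE",
        "intervals": ["2016-12-01/2016-12-02"]
      }
    },
    "ioConfig": {
      "type": "hadoop",
      "inputSpec": {
        "type": "static",
        "inputFormat": "io.druid.data.input.parquet.DruidParquetInputFormat",
        "paths": "hdfs:///path/to/data.parquet"
      }
    },
    "tuningConfig": { "type": "hadoop" }
  }
}
EOF

# Submit the task to the Overlord (port 18090 in this deployment)
curl -X POST -H 'Content-Type: application/json' \
     -d @parquet-index-task.json \
     http://tagtic-master:18090/druid/indexer/v1/task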

Druid task console (Overlord): http://tagtic-master:18090/console.html
Druid cluster console (Coordinator): http://tagtic-master:18081/#/

To avoid timezone issues when ingesting data through Hadoop, add the following to conf/druid/_common/mapred-site.xml:

<property>
    <name>mapreduce.map.java.opts</name>
    <value>-server -Xmx1536m -Duser.timezone=UTC -Dfile.encoding=UTF-8 -XX:+PrintGCDetails -XX:+PrintGCTimeStamps</value>
</property>
<property>
    <name>mapreduce.reduce.java.opts</name>
    <value>-server -Xmx2560m -Duser.timezone=UTC -Dfile.encoding=UTF-8 -XX:+PrintGCDetails -XX:+PrintGCTimeStamps</value>
</property>

 
