The "Caught exception while loading file struts-default.xml" error

The real cause of (and fix for) the JDK 6 / Struts 2 conflict when developing with MyEclipse 6
While developing with Struts 2 + Hibernate, I ran into the Struts 2 + JDK 6 conflict problem again.

Previously, when deploying to production, I would simply kill the Tomcat process and restart it. That guarantees a clean start, but it is not an option when the same Tomcat instance hosts several applications, since killing the process would take the other applications down as well. So I later switched to deploying through the Tomcat Manager web interface. The deployment itself succeeds, but the application then fails to start with the following error:
SEVERE: Exception starting filter struts2

Caught exception while loading file struts-default.xml - [unknown location]
    at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.loadConfigurationFiles(XmlConfigurationProvider.java:839)
    at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.loadDocuments(XmlConfigurationProvider.java:131)
    at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.init(XmlConfigurationProvider.java:100)
    at com.opensymphony.xwork2.config.impl.DefaultConfiguration.reload(DefaultConfiguration.java:130)
    at com.opensymphony.xwork2.config.ConfigurationManager.getConfiguration(ConfigurationManager.java:52)
    at org.apache.struts2.dispatcher.Dispatcher.init_PreloadConfiguration(Dispatcher.java:395)
    at org.apache.struts2.dispatcher.Dispatcher.init(Dispatcher.java:452)
    at org.apache.struts2.dispatcher.FilterDispatcher.init(FilterDispatcher.java:201)
    at org.apache.catalina.core.ApplicationFilterConfig.getFilter(ApplicationFilterConfig.java:275)
    at org.apache.catalina.core.ApplicationFilterConfig.setFilterDef(ApplicationFilterConfig.java:397)
    at org.apache.catalina.core.ApplicationFilterConfig.<init>(ApplicationFilterConfig.java:108)
    at org.apache.catalina.core.StandardContext.filterStart(StandardContext.java:3696)
    at org.apache.catalina.core.StandardContext.start(StandardContext.java:4343)
    at org.apache.catalina.core.StandardContext.reload(StandardContext.java:3086)
    at org.apache.catalina.manager.ManagerServlet.reload(ManagerServlet.java:912)
    at org.apache.catalina.manager.HTMLManagerServlet.reload(HTMLManagerServlet.java:523)
    at org.apache.catalina.manager.HTMLManagerServlet.doGet(HTMLManagerServlet.java:113)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:690)
    at javax.servlet.http.HttpServlet.service(HttpServlet.java:803)
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:290)
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:206)
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:233)
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:175)
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:525)
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:128)
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:109)
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:263)
    at org.apache.coyote.http11.Http11AprProcessor.process(Http11AprProcessor.java:852)
    at org.apache.coyote.http11.Http11AprProtocol$Http11ConnectionHandler.process(Http11AprProtocol.java:584)
    at org.apache.tomcat.util.net.AprEndpoint$Worker.run(AprEndpoint.java:1508)
    at java.lang.Thread.run(Thread.java:619)
Caused by: java.lang.ClassCastException: org.apache.xerces.parsers.XML11Configuration cannot be cast to org.apache.xerces.xni.parser.XMLParserConfiguration
    at org.apache.xerces.parsers.DOMParser.<init>(Unknown Source)
    at org.apache.xerces.parsers.DOMParser.<init>(Unknown Source)
    at org.apache.xerces.jaxp.DocumentBuilderImpl.<init>(Unknown Source)
    at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)
    at com.sun.org.apache.xalan.internal.xsltc.trax.SAX2DOM.<init>(SAX2DOM.java:69)
    at com.sun.org.apache.xalan.internal.xsltc.runtime.output.TransletOutputHandlerFactory.getSerializationHandler(TransletOutputHandlerFactory.java:187)
    at com.sun.org.apache.xalan.internal.xsltc.trax.TransformerImpl.getOutputHandler(TransformerImpl.java:392)
    at com.sun.org.apache.xalan.internal.xsltc.trax.TransformerHandlerImpl.setResult(TransformerHandlerImpl.java:137)
    at com.opensymphony.xwork2.util.DomHelper$DOMBuilder.setup(DomHelper.java:213)
    at com.opensymphony.xwork2.util.DomHelper$DOMBuilder.<init>(DomHelper.java:198)
    at com.opensymphony.xwork2.util.DomHelper$DOMBuilder.<init>(DomHelper.java:189)
    at com.opensymphony.xwork2.util.DomHelper$DOMBuilder.<init>(DomHelper.java:175)
    at com.opensymphony.xwork2.util.DomHelper.parse(DomHelper.java:115)
    at com.opensymphony.xwork2.config.providers.XmlConfigurationProvider.loadConfigurationFiles(XmlConfigurationProvider.java:830)
    ... 31 more

2008-9-19 0:08:34 org.apache.catalina.core.StandardContext start


The application runs fine after the first Tomcat start; it only fails when reloaded or started from http://localhost:8080/manager/html.

That makes it fairly certain some jar is conflicting, most likely an XML parser jar.

Inspecting the deployed WEB-INF/lib shows two XML parser jars: xml-apis.jar and xerces-2.6.2.jar.
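A quick way to spot this kind of clash is to scan the deployed WEB-INF/lib for bundled XML parser jars. The sketch below runs against a scratch directory seeded with hypothetical jar names (including a placeholder struts2-core jar) so it is safe to execute anywhere; in practice, point LIB at the real deployed path.

```shell
# Scratch stand-in for a deployed webapp's WEB-INF/lib; for a real check,
# set LIB to something like $CATALINA_HOME/webapps/<app>/WEB-INF/lib.
LIB=$(mktemp -d)
touch "$LIB/xml-apis.jar" "$LIB/xerces-2.6.2.jar" "$LIB/struts2-core-2.0.11.jar"

# Any hit here duplicates the XML parser already built into JDK 6 and is a
# candidate cause of the ClassCastException in the stack trace above.
CONFLICTS=$(ls "$LIB" | grep -Ei 'xerces|xml-apis')
echo "$CONFLICTS"
```

On a clean deployment the grep matches nothing; here it flags exactly the two jars named above.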

The real cause of this error is not a conflict between JDK 6 and Struts 2 themselves. Rather, the MyEclipse Hibernate library set bundles two extra jars, xml-apis.jar and xerces-2.6.2.jar, and these duplicate (and clash with) the XML parser built into the JDK. Two fixes: 1. delete these two files from WEB-INF/lib/ in the deployed directory; or 2. run Tomcat 6 on JDK 1.5 instead.

Solution
1> Delete xml-apis.jar and xerces-2.6.2.jar from WEB-INF/lib in the deployed directory.
2> Stop Tomcat and start it again; there is no need to redeploy.
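The two steps can be sketched in shell. The deletion is demonstrated on a scratch copy of the deployed layout (the struts2-core jar name is an illustrative stand-in) so the snippet runs anywhere; on a real server, LIB would be $CATALINA_HOME/webapps/<app>/WEB-INF/lib.

```shell
# Scratch deployed-webapp layout; substitute the real WEB-INF/lib path.
LIB=$(mktemp -d)
touch "$LIB/xml-apis.jar" "$LIB/xerces-2.6.2.jar" "$LIB/struts2-core-2.0.11.jar"

# Step 1: remove the two parser jars that shadow the JDK 6 built-in Xerces.
rm -f "$LIB/xml-apis.jar" "$LIB/xerces-2.6.2.jar"
ls "$LIB"   # only the application jars remain

# Step 2 (on the real server): restart Tomcat; no redeploy is needed.
#   $CATALINA_HOME/bin/shutdown.sh && $CATALINA_HOME/bin/startup.sh
```

Deleting the jars only from the deployed directory works because Tomcat serves the exploded copy; just remember to also remove them from the project so the next deployment does not reintroduce them.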