XML Parsing: "The processing instruction must begin with the name of the target"

This post resolves an XML parsing error encountered while using Eclipse JMerge: a subtle mistake in the XML file is identified and corrected so that the program runs normally again. It also explains how to read the error message, helping developers locate and fix the problem quickly.

Original post: http://hi.baidu.com/caoxmbd/blog/item/7340e8974afd076b55fb96f3.html


  While trying out Eclipse JMerge today, parsing merge.xml kept failing with "The processing instruction must begin with the name of the target". I read through the XML file several times without spotting anything unusual; a web search showed it is actually a common problem that many people have run into.

The cause: in <? xml version = "1.0" encoding = "UTF-8" ?> there is an extra space between the question mark and "xml". Remove the space so the declaration reads <?xml version = "1.0" encoding = "UTF-8" ?> and everything works again.

The message "The processing instruction must begin with the name of the target" is rather baffling at first. It makes more sense once you know that anything starting with "<?" is parsed as a processing instruction, and a processing instruction must begin with its target name ("xml" in the case of the XML declaration); with a space right after "<?" the parser finds no target name and reports exactly this error.
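A minimal sketch using the standard JAXP API (the two document strings here are made up for illustration) that reproduces the error and shows that removing the space fixes it:

    import java.io.ByteArrayInputStream;
    import java.nio.charset.StandardCharsets;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.xml.sax.SAXParseException;

    public class PiTargetDemo {
        public static void main(String[] args) throws Exception {
            // Extra space between "<?" and "xml" -- the broken declaration.
            String bad  = "<? xml version=\"1.0\" encoding=\"UTF-8\"?>\n<root/>";
            // Same document without the space -- parses fine.
            String good = "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n<root/>";

            for (String xml : new String[] { bad, good }) {
                try {
                    DocumentBuilderFactory.newInstance().newDocumentBuilder()
                            .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
                    System.out.println("parsed OK");
                } catch (SAXParseException e) {
                    // Xerces reports the position of the offending character.
                    System.out.println("parse failed at line " + e.getLineNumber()
                            + ", column " + e.getColumnNumber() + ": " + e.getMessage());
                }
            }
        }
    }

Running this prints the failure for the first string (at line 1, column 3) and "parsed OK" for the second.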


Similarly,

<? xml version="1.0" encoding="UTF-8" ?>  
<! DOCTYPE sqlMapConfig  PUBLIC "-//iBATIS.com//DTD SQL Map Config 2.0//EN"
  "http://www.ibatis.com/dtd/sql-map-config-2.dtd" > 

also needs to be changed to:

<?xml version="1.0" encoding="UTF-8" ?>  
<!DOCTYPE sqlMapConfig  PUBLIC "-//iBATIS.com//DTD SQL Map Config 2.0//EN"
  "http://www.ibatis.com/dtd/sql-map-config-2.dtd" > 

The same error turns up in other tools as well. For example, Hive fails to start when hive-site.xml has a malformed declaration on line 1:

[root@hadoop01 conf]# hive
[Fatal Error] hive-site.xml:1:3: The processing instruction must begin with the name of the target.
25/06/11 04:09:28 FATAL conf.Configuration: error parsing conf file:/opt/programs/apache-hive-1.2.2-bin/conf/hive-site.xml
org.xml.sax.SAXParseException; systemId: file:/opt/programs/apache-hive-1.2.2-bin/conf/hive-site.xml; lineNumber: 1; columnNumber: 3; The processing instruction must begin with the name of the target.
    at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)
    at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)
    at javax.xml.parsers.DocumentBuilder.parse(DocumentBuilder.java:150)
    at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2552)
    at org.apache.hadoop.conf.Configuration.parse(Configuration.java:2540)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2611)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2574)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2477)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1302)
    at org.apache.hadoop.hive.conf.HiveConf.getVar(HiveConf.java:2615)
    at org.apache.hadoop.hive.conf.HiveConf.getVar(HiveConf.java:2636)
    at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:2707)
    at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:2651)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:74)
    at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:58)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:637)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.jav
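To see exactly which character the parser is complaining about (here line 1, column 3 of hive-site.xml), it can help to print the first line of the file together with its character codes. A minimal, hypothetical sketch, assuming the path from the log above:

    import java.nio.file.Files;
    import java.nio.file.Paths;
    import java.util.List;

    public class ShowFirstLine {
        public static void main(String[] args) throws Exception {
            // Path taken from the error log above; adjust to your installation.
            String path = "/opt/programs/apache-hive-1.2.2-bin/conf/hive-site.xml";
            List<String> lines = Files.readAllLines(Paths.get(path));
            String first = lines.get(0);
            System.out.println("First line: [" + first + "]");
            // Print the leading characters with their code points; a space (U+0020)
            // or a byte-order mark (U+FEFF) near "<?" is the usual culprit.
            for (int i = 0; i < Math.min(first.length(), 10); i++) {
                char c = first.charAt(i);
                System.out.printf("col %d: '%c' (U+%04X)%n", i + 1, c, (int) c);
            }
        }
    }

Whatever shows up at the reported column is what needs to be removed; after that, Hive parses the file normally.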