Solving the "Failed to parse SourceMap: http:xxx" Problem

Recently, while debugging a website, I noticed a strange error in the Chrome DevTools console, something like "Failed to parse SourceMap: http:xxx". It did not look like a problem in my JavaScript code, so I Googled it and found there was quite a bit behind it.

First, a quick overview of the SourceMap technique. A modern website often has many JavaScript files. Take ASP.NET MVC, for example: if you use the Bundle classes, the framework can merge and minify those JS files before rendering content to the client browser, stripping whitespace and the like to shrink the files and speed up page loads. The downside is that the JS can no longer be debugged in the browser, because a minified file is extremely compact, with no whitespace or line breaks.

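To make the trade-off concrete, here is a tiny hand-made illustration of what minification does (the file names and output below are mine, for illustration, not produced by any particular tool):

    // greet.js -- the original, readable source
    function greet(name) {
        return "Hello, " + name + "!";
    }

    // greet.min.js -- the same logic after minification:
    // whitespace gone, the local identifier shortened
    function greet(n){return"Hello, "+n+"!"}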

A quick aside first: how do you turn a minified JS file back into formatted code? There is an online tool that does exactly this: http://jsbeautifier.org/. Open the page, paste in the minified JS, press Ctrl+Enter, and it formats the code for you automatically. Very convenient.

What I just described is beautifying: you copy the code out and format it, which is not the same as debugging. To see formatted JS in the browser while debugging, we need the concept of a SourceMap.

In a nutshell, a SourceMap is a way to map minified JS code back to its formatted original. When you deploy to production, alongside the minified and optimized JS you also ship a sourcemap file that points back to the original JS source. When Chrome receives the minified JS file, it automatically looks for the associated sourcemap file on the server and maps the minified code back to well-formatted JS.

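The sourcemap file itself is just a small JSON document. Here is a minimal sketch of a version 3 map for the hypothetical greet.min.js above; the mappings value is a Base64 VLQ string emitted by the build tool, shortened here purely for illustration:

    {
      "version": 3,
      "file": "greet.min.js",
      "sources": ["greet.js"],
      "names": ["greet", "name"],
      "mappings": "AAAA,SAASA,..."
    }

The sources array tells the browser which original files to fetch; alternatively, a map can embed the originals directly in an optional sourcesContent field.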

For Chrome to do this, the feature has to be switched on first. In the DevTools settings panel, check the corresponding option, "Enable JavaScript source maps". Note: this is the DevTools settings page, not Chrome's own settings page.

So how does SourceMap actually work end to end? First you need a tool that minifies the existing JS files and produces the corresponding sourcemap file at the same time; then you deploy both to the server. Not every minifier qualifies: the Bundle classes in ASP.NET MVC, for instance, do not. The most common choice today is the Closure Compiler. Once the minified JS is generated, a comment line is added to the file (by convention at the very end) telling the browser which file on the server is the map file. The browser downloads that map file only if the SourceMap setting in DevTools mentioned above is enabled; otherwise it never fetches it. This is the beauty of the scheme: ordinary users never download the map, so their pages use only the optimized JS and stay fast, while developers can download the map file and debug.

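As a sketch, producing both files with the Closure Compiler could look like the following (the file names here are my own, for illustration). Note that, as far as I know, the compiler does not append the pointer comment for you, so you add it to the end of the minified file yourself:

    java -jar closure-compiler.jar \
        --js greet.js \
        --js_output_file greet.min.js \
        --create_source_map greet.min.js.map

and the last line you append to greet.min.js (older tooling wrote //@ instead of //#):

    //# sourceMappingURL=greet.min.js.map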

Now back to the original question: why was my site throwing that error?


The message makes the cause clear: the site references a third-party, minified build of angular, that build carries a sourcemap pointer, and my site has no corresponding map file, so the error is thrown.

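Concretely, assuming the stock angular.min.js distribution, its last line is a sourcemap pointer like the one below. Two simple fixes: deploy angular.min.js.map, which ships with the Angular release, into the same server directory as angular.min.js so the URL resolves; or, if you never intend to debug it, delete this pointer line from the minified file, so that Chrome has nothing to fetch and the warning disappears.

    //# sourceMappingURL=angular.min.js.map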

References:

http://www.html5rocks.com/en/tutorials/developertools/sourcemaps/

http://stackoverflow.com/questions/36051891/esri-failed-to-parse-source-map
