Finding the log4j.properties Used by the Hadoop Job Submission Class

This article examines the Log4j configuration problem inside the Driver during Hadoop Job submission: multiple log4j.properties files on the classpath cause the intended log-level settings to be ignored. A workaround is provided.

Terminology: the Hadoop Job submission class is referred to as the Driver in this article.

 

Main text:

Sometimes we need to write some logic in the Driver that uses Log4j, but when the MapReduce job is submitted with hadoop jar xxxx.jar, the Driver's log output does not appear as expected. By printing the paths of the log4j.properties files on the classpath, we found that at Driver runtime the classpath contains several log4j.properties files:

/etc/hadoop-0.20/conf.empty/log4j.properties
file:/usr/lib/hadoop-0.20/hadoop-core-0.20.2-cdh3u4.jar!/log4j.properties
/tmp/hadoop/hadoop/hadoop-unjar1904689375813052409/log4j.properties
file:/Application/adam/publish/CLA_Hadoop_publish.jar!/log4j.properties
/Application/adam/publish/conf/log4j.properties

 

Log4j only loads the first configuration it encounters, so the log4j.properties file inside our MapReduce project jar never takes effect for the Driver.
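
Incidentally, you can confirm which file wins without digging through Log4j internals: unless overridden by the log4j.configuration system property, Log4j 1.x locates its default configuration via the class loader's getResource lookup, which returns only the first match in classpath order. A minimal sketch (MyDriver is a hypothetical stand-in for your own Driver class):

import java.net.URL;

// getResource() returns the FIRST log4j.properties on the classpath --
// the same file Log4j 1.x picks up by default.
URL winner = MyDriver.class.getClassLoader().getResource("log4j.properties");
System.out.println("log4j will load: " + winner);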

 

Sadly, the only way to get the behavior we want is to modify "/etc/hadoop-0.20/conf.empty/log4j.properties", i.e. the log4j.properties under the Hadoop installation's conf directory, so be careful when relying on this. A sample configuration of mine:

 


log4j.logger.net.sf.json.JSONObject=ERROR
log4j.logger.org.apache.commons.httpclient.HttpMethodBase=ERROR
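
Note that the two lines above only tune the levels of specific loggers; they assume the root logger and appenders are already defined elsewhere in that file, as Hadoop's stock log4j.properties does. For reference, a minimal self-contained sketch in standard log4j 1.x syntax:

log4j.rootLogger=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{ISO8601} %-5p %c{1} - %m%n
log4j.logger.net.sf.json.JSONObject=ERROR
log4j.logger.org.apache.commons.httpclient.HttpMethodBase=ERROR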

 

Of course, the log level can also be set programmatically via the setLevel method.
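
A minimal sketch with the log4j 1.x API, placed at the top of the Driver's main() so it runs before job submission (the logger names match the sample above):

import org.apache.log4j.Level;
import org.apache.log4j.Logger;

// Raise noisy third-party loggers to ERROR programmatically; this overrides
// whatever level the first log4j.properties on the classpath configured.
Logger.getLogger("net.sf.json.JSONObject").setLevel(Level.ERROR);
Logger.getLogger("org.apache.commons.httpclient.HttpMethodBase").setLevel(Level.ERROR);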

 

In addition, here is how to list every log4j.properties file on the classpath:

 

// Requires: import java.io.IOException; import java.net.URL; import java.util.Enumeration;
// getResources() throws IOException, so declare or catch it.
Enumeration<URL> resources = this.getClass().getClassLoader().getResources("log4j.properties");
while (resources.hasMoreElements()) {
	URL url = resources.nextElement(); // already typed as URL, no cast needed
	System.out.println(url.getPath());
}
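
Run this early in the Driver's main(); the URLs print in classpath order, and the first entry is the file Log4j actually loads.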
 

 

--heipark