Configuring Log4j in a Standalone App

This article shows how to configure Log4j with either an XML file or a properties file, including setting up console output and adjusting the log level, with working examples of each.

1. First, make sure log4j.jar is on your application's classpath.
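If the project uses Maven instead of a manually added jar, the equivalent dependency for the classic Log4j 1.2 line (the version this article configures) would be:

```xml
<dependency>
    <groupId>log4j</groupId>
    <artifactId>log4j</artifactId>
    <version>1.2.17</version>
</dependency>
```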

2. Configure either log4j.xml or log4j.properties and place it at the root of the classpath. You only need one of the two; if both are present, Log4j 1.x loads log4j.xml first.

Here is an example of each:

log4j.xml

<?xml version="1.0" encoding="UTF-8" ?>
<!DOCTYPE log4j:configuration SYSTEM "log4j.dtd">
<log4j:configuration xmlns:log4j='http://jakarta.apache.org/log4j/'>
	<appender name="CA" class="org.apache.log4j.ConsoleAppender">
		<layout class="org.apache.log4j.PatternLayout">
			<param name="ConversionPattern" value="%-4r [%t] %-5p %c %x - %m%n" />
		</layout>
	</appender>
	<root>
		<level value="debug" />
		<appender-ref ref="CA" />
	</root>
</log4j:configuration>

log4j.properties

log4j.rootLogger=DEBUG, CA
log4j.appender.CA=org.apache.log4j.ConsoleAppender
log4j.appender.CA.layout=org.apache.log4j.PatternLayout
log4j.appender.CA.layout.ConversionPattern=%-4r [%t] %-5p %c %x - %m%n
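With either file on the classpath, a minimal class such as the following can verify the setup (the class name is illustrative, not from the original article). Each call should print one line through the console appender, formatted by the pattern above as elapsed time, thread, level, logger name, NDC, and message:

```java
import org.apache.log4j.Logger;

public class LogDemo {
    // Log4j derives the logger name from the fully qualified class name
    private static final Logger logger = Logger.getLogger(LogDemo.class);

    public static void main(String[] args) {
        logger.debug("debug message");  // printed, since the root level is DEBUG
        logger.info("info message");
        logger.error("error message");
    }
}
```

Compile and run it with log4j.jar and the config file on the classpath, e.g. `java -cp .:log4j.jar LogDemo`.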

Done!

http://www.dzone.com/tutorials/java/log4j/log4j-xml-configuration.html
