spark-submit hangs, endlessly printing "Application report for application_1599648117906_0005 (state: ACCEPTED)" — causes and fixes

When a Spark job is submitted to YARN and the application report stays stuck at (state: ACCEPTED), common causes include: the YARN cluster is not fully started, cluster clocks are out of sync, memory is allocated so that the scheduler cannot place the ApplicationMaster container, or Spark itself is misconfigured. The fixes follow from the causes: verify that all YARN daemons are running, synchronize cluster time, adjust memory settings, and correct Spark's default configuration and environment variables (spark-defaults.conf and spark-env.sh).
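Clock skew between nodes is one of the causes listed above and is cheap to rule out first. A minimal sketch of the check, where the hostnames (master, slave1, slave2) and passwordless ssh are assumptions about your cluster:

```shell
# Succeed only when two epoch timestamps are within max_skew seconds of
# each other. Pure shell arithmetic, so this part runs anywhere.
skew_ok() {  # skew_ok <epoch_a> <epoch_b> <max_skew_seconds>
  a=$1; b=$2; max=$3
  d=$(( a > b ? a - b : b - a ))
  [ "$d" -le "$max" ]
}

# On a real cluster (hostnames are placeholders), collect each node's clock:
#   for h in master slave1 slave2; do echo "$h $(ssh "$h" date +%s)"; done
# then compare the values pairwise with skew_ok, and re-sync with
# ntpdate/chrony on any node that fails.
```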

When the job runs, it stalls at (state: ACCEPTED) and never progresses:

20/09/09 11:15:16 INFO yarn.Client: Application report for application_1599648117906_0004 (state: ACCEPTED)
20/09/09 11:15:17 INFO yarn.Client: Application report for application_1599648117906_0004 (state: ACCEPTED)
20/09/09 11:15:18 INFO yarn.Client: Application report for application_1599648117906_0004 (state: ACCEPTED)
20/09/09 11:15:19 INFO yarn.Client: Application report for application_1599648117906_0004 (state: ACCEPTED)
20/09/09 11:15:20 INFO yarn.Client: Application report for application_1599648117906_0004 (state: ACCEPTED)
20/09/09 11:15:21 INFO yarn.Client: Application report for application_1599648117906_0004 (state: ACCEPTED)
20/09/09 11:15:22 INFO yarn.Client: Application report for application_1599648117906_0004 (state: ACCEPTED)
20/09/09 11:15:23 INFO yarn.Client: Application report for application_1599648117906_0004 (state: ACCEPTED)
20/09/09 11:15:24 INFO yarn.Client: Application report for application_1599648117906_0004 (state: ACCEPTED)
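While the client loops like this, the application is sitting in the ResourceManager's queue. From another terminal you can inspect or kill it with the standard YARN CLI (`yarn application -list -appStates ACCEPTED`, `yarn application -kill <appId>`). A small sketch, where the helper that pulls the application id out of a report line is hypothetical:

```shell
# Extract the application id (application_<cluster-ts>_<seq>) from a
# yarn.Client report line; plain text processing, safe to run offline.
app_id_from_report() {
  printf '%s\n' "$1" | grep -o 'application_[0-9]*_[0-9]*' | head -n 1
}

line='20/09/09 11:15:16 INFO yarn.Client: Application report for application_1599648117906_0004 (state: ACCEPTED)'
app_id_from_report "$line"   # -> application_1599648117906_0004

# Against a live cluster you would then run:
#   yarn application -kill "$(app_id_from_report "$line")"
```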
[root@master conf]# spark-submit --master yarn --class org.apache.spark.examples.SparkPi $SPARK_HOME/examples/jars/spark-examples_2.12-3.1.1.jar
25/11/12 04:52:13 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
25/11/12 04:52:14 INFO SparkContext: Running Spark version 3.1.1
25/11/12 04:52:14 INFO ResourceUtils: ==============================================================
25/11/12 04:52:14 INFO ResourceUtils: No custom resources configured for spark.driver.
25/11/12 04:52:14 INFO ResourceUtils: ==============================================================
25/11/12 04:52:14 INFO SparkContext: Submitted application: Spark Pi
25/11/12 04:52:14 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
25/11/12 04:52:14 INFO ResourceProfile: Limiting resource is cpus at 1 tasks per executor
25/11/12 04:52:14 INFO ResourceProfileManager: Added ResourceProfile id: 0
25/11/12 04:52:14 INFO SecurityManager: Changing view acls to: root
25/11/12 04:52:14 INFO SecurityManager: Changing modify acls to: root
25/11/12 04:52:14 INFO SecurityManager: Changing view acls groups to:
25/11/12 04:52:14 INFO SecurityManager: Changing modify acls groups to:
25/11/12 04:52:14 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
25/11/12 04:52:15 INFO Utils: Successfully started service 'sparkDriver' on port 34815.
25/11/12 04:52:15 INFO SparkEnv: Registering MapOutputTracker
25/11/12 04:52:15 INFO SparkEnv: Registering BlockManagerMaster
25/11/12 04:52:15 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
25/11/12 04:52:15 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
25/11/12 04:52:15 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
25/11/12 04:52:15 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-3940abd9-f976-4673-ac49-5bffc32a9ec4
25/11/12 04:52:15 INFO MemoryStore: MemoryStore started with capacity 413.9 MiB
25/11/12 04:52:15 INFO SparkEnv: Registering OutputCommitCoordinator
25/11/12 04:52:15 INFO Utils: Successfully started service 'SparkUI' on port 4040.
25/11/12 04:52:15 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://master:4040
25/11/12 04:52:15 INFO SparkContext: Added JAR file:/opt/module/spark/examples/jars/spark-examples_2.12-3.1.1.jar at spark://master:34815/jars/spark-examples_2.12-3.1.1.jar with timestamp 1762894334373
25/11/12 04:52:16 INFO RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
25/11/12 04:52:16 INFO Client: Requesting a new application from cluster with 3 NodeManagers
25/11/12 04:52:17 INFO Configuration: resource-types.xml not found
25/11/12 04:52:17 INFO ResourceUtils: Unable to find 'resource-types.xml'.
25/11/12 04:52:17 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
25/11/12 04:52:17 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
25/11/12 04:52:17 INFO Client: Setting up container launch context for our AM
25/11/12 04:52:17 INFO Client: Setting up the launch environment for our AM container
25/11/12 04:52:17 INFO Client: Preparing resources for our AM container
25/11/12 04:52:17 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
25/11/12 04:52:18 INFO Client: Uploading resource file:/tmp/spark-0eb147ed-0ede-4f24-90b9-65a8f65aa65b/__spark_libs__1298849544083591332.zip -> file:/root/.sparkStaging/application_1762894202540_0002/__spark_libs__1298849544083591332.zip
25/11/12 04:52:19 INFO Client: Uploading resource file:/tmp/spark-0eb147ed-0ede-4f24-90b9-65a8f65aa65b/__spark_conf__2273853142442443753.zip -> file:/root/.sparkStaging/application_1762894202540_0002/__spark_conf__.zip
25/11/12 04:52:19 INFO SecurityManager: Changing view acls to: root
25/11/12 04:52:19 INFO SecurityManager: Changing modify acls to: root
25/11/12 04:52:19 INFO SecurityManager: Changing view acls groups to:
25/11/12 04:52:19 INFO SecurityManager: Changing modify acls groups to:
25/11/12 04:52:19 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
25/11/12 04:52:19 INFO Client: Submitting application application_1762894202540_0002 to ResourceManager
25/11/12 04:52:20 INFO YarnClientImpl: Submitted application application_1762894202540_0002
25/11/12 04:52:21 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED)
25/11/12 04:52:21 INFO Client:
	 client token: N/A
	 diagnostics: [Wed Nov 12 04:52:20 +0800 2025] Scheduler has assigned a container for AM, waiting for AM container to be launched
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: default
	 start time: 1762894339943
	 final status: UNDEFINED
	 tracking URL: http://master:8088/proxy/application_1762894202540_0002/
	 user: root
25/11/12 04:52:22 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED)
25/11/12 04:52:23 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED)
25/11/12 04:52:24 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED)
25/11/12 04:52:25 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED)
25/11/12 04:52:26 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED)
25/11/12 04:52:27 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED)
25/11/12 04:52:28 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED)
25/11/12 04:52:29 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED)
^C25/11/12 04:52:30 INFO DiskBlockManager: Shutdown hook called
25/11/12 04:52:30 INFO ShutdownHookManager: Shutdown hook called
25/11/12 04:52:30 INFO ShutdownHookManager: Deleting directory /tmp/spark-c213ed8f-d019-49b8-af3c-48d7d225c929
25/11/12 04:52:30 INFO ShutdownHookManager: Deleting directory /tmp/spark-0eb147ed-0ede-4f24-90b9-65a8f65aa65b/userFiles-0c002819-b899-477a-8142-ea070ae03495
25/11/12 04:52:30 INFO ShutdownHookManager: Deleting directory /tmp/spark-0eb147ed-0ede-4f24-90b9-65a8f65aa65b
[root@master conf]# vi spark-env.sh
[root@master conf]# spark-submit --master yarn --class org.apache.spark.examples.SparkPi $SPARK_HOME/examples/jars/spark-examples_2.12.jar
2025-11-12 04:52:57,994 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2025-11-12 04:52:58,381 WARN deploy.DependencyUtils: Local jar /opt/module/spark/examples/jars/spark-examples_2.12.jar does not exist, skipping.
Error: Failed to load class org.apache.spark.examples.SparkPi.
2025-11-12 04:52:58,394 INFO util.ShutdownHookManager: Shutdown hook called
2025-11-12 04:52:58,394 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-7e0279a0-e7f1-4002-a93b-a217deae6472
[root@master conf]# spark-submit --master yarn --class org.apache.spark.examples.SparkPi $SPARK_HOME/examples/jars/spark-examples.jar
2025-11-12 04:53:09,709 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2025-11-12 04:53:09,983 WARN deploy.DependencyUtils: Local jar /opt/module/spark/examples/jars/spark-examples.jar does not exist, skipping.
Error: Failed to load class org.apache.spark.examples.SparkPi.
2025-11-12 04:53:09,997 INFO util.ShutdownHookManager: Shutdown hook called
2025-11-12 04:53:09,998 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-ced88073-ab63-4a74-881c-ec43e436e161
[root@master conf]# spark-submit --master yarn --class org.apache.spark.examples.SparkPi $SPARK_HOME/examples/jars/spark-examples_2.12-3.1.1.jar
2025-11-12 04:53:15,900 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2025-11-12 04:53:16,256 INFO spark.SparkContext: Running Spark version 3.1.1
2025-11-12 04:53:16,332 INFO resource.ResourceUtils: ==============================================================
2025-11-12 04:53:16,332 INFO resource.ResourceUtils: No custom resources configured for spark.driver.
2025-11-12 04:53:16,332 INFO resource.ResourceUtils: ==============================================================
2025-11-12 04:53:16,333 INFO spark.SparkContext: Submitted application: Spark Pi
2025-11-12 04:53:16,377 INFO resource.ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
2025-11-12 04:53:16,435 INFO resource.ResourceProfile: Limiting resource is cpus at 1 tasks per executor
2025-11-12 04:53:16,437 INFO resource.ResourceProfileManager: Added ResourceProfile id: 0
2025-11-12 04:53:16,566 INFO spark.SecurityManager: Changing view acls to: root
2025-11-12 04:53:16,566 INFO spark.SecurityManager: Changing modify acls to: root
2025-11-12 04:53:16,566 INFO spark.SecurityManager: Changing view acls groups to:
2025-11-12 04:53:16,566 INFO spark.SecurityManager: Changing modify acls groups to:
2025-11-12 04:53:16,566 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
2025-11-12 04:53:16,890 INFO util.Utils: Successfully started service 'sparkDriver' on port 33022.
2025-11-12 04:53:16,935 INFO spark.SparkEnv: Registering MapOutputTracker
2025-11-12 04:53:16,980 INFO spark.SparkEnv: Registering BlockManagerMaster
2025-11-12 04:53:17,006 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2025-11-12 04:53:17,007 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
2025-11-12 04:53:17,094 INFO spark.SparkEnv: Registering BlockManagerMasterHeartbeat
2025-11-12 04:53:17,114 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-6a6708df-05b8-407f-9011-dd935e59e38f
2025-11-12 04:53:17,140 INFO memory.MemoryStore: MemoryStore started with capacity 413.9 MiB
2025-11-12 04:53:17,206 INFO spark.SparkEnv: Registering OutputCommitCoordinator
2025-11-12 04:53:17,346 INFO util.log: Logging initialized @3309ms to org.sparkproject.jetty.util.log.Slf4jLog
2025-11-12 04:53:17,506 INFO server.Server: jetty-9.4.36.v20210114; built: 2021-01-14T16:44:28.689Z; git: 238ec6997c7806b055319a6d11f8ae7564adc0de; jvm 1.8.0_211-b12
2025-11-12 04:53:17,590 INFO server.Server: Started @3554ms
2025-11-12 04:53:17,704 INFO server.AbstractConnector: Started ServerConnector@68ed96ca{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
2025-11-12 04:53:17,705 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
2025-11-12 04:53:17,734 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@a23a01d{/jobs,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,736 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@61a5b4ae{/jobs/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,736 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5b69fd74{/jobs/job,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,744 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@63a5e46c{/jobs/job/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,744 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@49ef32e0{/stages,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,745 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6bd51ed8{/stages/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,746 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@51abf713{/stages/stage,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,747 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3fc08eec{/stages/stage/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,748 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7b02e036{/stages/pool,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,748 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1e287667{/stages/pool/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,749 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4201a617{/storage,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,749 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1bb9aa43{/storage/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,750 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@df5f5c0{/storage/rdd,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,751 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@66b72664{/storage/rdd/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,751 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@58cd06cb{/environment,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,752 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@64b31700{/environment/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,753 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@bae47a0{/executors,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,753 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@85ec632{/executors/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,754 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@65ef722a{/executors/threadDump,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,760 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@214894fc{/executors/threadDump/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,772 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@e362c57{/static,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,773 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@24528a25{/,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,775 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@59221b97{/api,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,775 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3ee39da0{/jobs/job/kill,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,776 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7cc9ce8{/stages/stage/kill,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,778 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://master:4040
2025-11-12 04:53:17,821 INFO spark.SparkContext: Added JAR file:/opt/module/spark/examples/jars/spark-examples_2.12-3.1.1.jar at spark://master:33022/jars/spark-examples_2.12-3.1.1.jar with timestamp 1762894396246
2025-11-12 04:53:18,269 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.43.100:8032
2025-11-12 04:53:18,540 INFO yarn.Client: Requesting a new application from cluster with 3 NodeManagers
2025-11-12 04:53:19,385 INFO conf.Configuration: resource-types.xml not found
2025-11-12 04:53:19,385 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
2025-11-12 04:53:19,413 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
2025-11-12 04:53:19,413 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
2025-11-12 04:53:19,414 INFO yarn.Client: Setting up container launch context for our AM
2025-11-12 04:53:19,414 INFO yarn.Client: Setting up the launch environment for our AM container
2025-11-12 04:53:19,428 INFO yarn.Client: Preparing resources for our AM container
2025-11-12 04:53:19,492 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
2025-11-12 04:53:21,016 INFO yarn.Client: Uploading resource file:/tmp/spark-9cf7b4b0-d6e0-4258-b2ea-7edb7303c330/__spark_libs__4318401642194911955.zip -> hdfs://master:9000/user/root/.sparkStaging/application_1762894202540_0003/__spark_libs__4318401642194911955.zip
2025-11-12 04:53:24,623 INFO yarn.Client: Uploading resource file:/tmp/spark-9cf7b4b0-d6e0-4258-b2ea-7edb7303c330/__spark_conf__5067439934985698710.zip -> hdfs://master:9000/user/root/.sparkStaging/application_1762894202540_0003/__spark_conf__.zip
2025-11-12 04:53:25,145 INFO spark.SecurityManager: Changing view acls to: root
2025-11-12 04:53:25,145 INFO spark.SecurityManager: Changing modify acls to: root
2025-11-12 04:53:25,145 INFO spark.SecurityManager: Changing view acls groups to:
2025-11-12 04:53:25,145 INFO spark.SecurityManager: Changing modify acls groups to:
2025-11-12 04:53:25,145 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
2025-11-12 04:53:25,182 INFO yarn.Client: Submitting application application_1762894202540_0003 to ResourceManager
2025-11-12 04:53:25,232 INFO impl.YarnClientImpl: Submitted application application_1762894202540_0003
2025-11-12 04:53:26,236 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED)
2025-11-12 04:53:26,238 INFO yarn.Client:
	 client token: N/A
	 diagnostics: [Wed Nov 12 04:53:25 +0800 2025] Scheduler has assigned a container for AM, waiting for AM container to be launched
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: default
	 start time: 1762894405197
	 final status: UNDEFINED
	 tracking URL: http://master:8088/proxy/application_1762894202540_0003/
	 user: root
2025-11-12 04:53:27,242 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED)
2025-11-12 04:53:28,247 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED)
2025-11-12 04:53:29,250 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED)
2025-11-12 04:53:30,253 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED)
2025-11-12 04:53:31,257 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED)
^C2025-11-12 04:53:32,204 INFO storage.DiskBlockManager: Shutdown hook called
2025-11-12 04:53:32,211 INFO util.ShutdownHookManager: Shutdown hook called
2025-11-12 04:53:32,211 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-9cf7b4b0-d6e0-4258-b2ea-7edb7303c330
2025-11-12 04:53:32,215 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-b7982e3f-5f51-4b89-8b11-120862f2a08c
2025-11-12 04:53:32,216 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-9cf7b4b0-d6e0-4258-b2ea-7edb7303c330/userFiles-3eb53278-d265-42c2-9531-d8811165f34d
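The two runs above differ in one telling way: the first connects to the ResourceManager at /0.0.0.0:8032 and stages its jars to file:/root/.sparkStaging (the local filesystem), while the run after editing spark-env.sh connects to master/192.168.43.100:8032 and stages to hdfs://master:9000. That pattern appears when the Spark client cannot find the Hadoop client configuration. A minimal spark-env.sh sketch — the /opt/module path matches the layout shown in the logs but is a placeholder for your installation:

```shell
# spark-env.sh — point Spark at the Hadoop/YARN client configuration so
# spark-submit talks to the real ResourceManager instead of 0.0.0.0:8032.
export HADOOP_CONF_DIR=/opt/module/hadoop/etc/hadoop
export YARN_CONF_DIR=/opt/module/hadoop/etc/hadoop
```

If the job still parks in ACCEPTED after this, check that the default queue has free resources: per the log, the AM needs an 896 MB container (512 MB plus 384 MB overhead), so yarn.nodemanager.resource.memory-mb and yarn.scheduler.maximum-allocation-mb on the NodeManagers must leave at least that much headroom.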
conf.Configuration: resource-types.xml not found 2025-11-12 04:53:19,385 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'. 2025-11-12 04:53:19,413 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container) 2025-11-12 04:53:19,413 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead 2025-11-12 04:53:19,414 INFO yarn.Client: Setting up container launch context for our AM 2025-11-12 04:53:19,414 INFO yarn.Client: Setting up the launch environment for our AM container 2025-11-12 04:53:19,428 INFO yarn.Client: Preparing resources for our AM container 2025-11-12 04:53:19,492 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME. 2025-11-12 04:53:21,016 INFO yarn.Client: Uploading resource file:/tmp/spark-9cf7b4b0-d6e0-4258-b2ea-7edb7303c330/__spark_libs__4318401642194911955.zip -> hdfs://master:9000/user/root/.sparkStaging/application_1762894202540_0003/__spark_libs__4318401642194911955.zip 2025-11-12 04:53:24,623 INFO yarn.Client: Uploading resource file:/tmp/spark-9cf7b4b0-d6e0-4258-b2ea-7edb7303c330/__spark_conf__5067439934985698710.zip -> hdfs://master:9000/user/root/.sparkStaging/application_1762894202540_0003/__spark_conf__.zip 2025-11-12 04:53:25,145 INFO spark.SecurityManager: Changing view acls to: root 2025-11-12 04:53:25,145 INFO spark.SecurityManager: Changing modify acls to: root 2025-11-12 04:53:25,145 INFO spark.SecurityManager: Changing view acls groups to: 2025-11-12 04:53:25,145 INFO spark.SecurityManager: Changing modify acls groups to: 2025-11-12 04:53:25,145 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set() 2025-11-12 04:53:25,182 INFO yarn.Client: 
Submitting application application_1762894202540_0003 to ResourceManager 2025-11-12 04:53:25,232 INFO impl.YarnClientImpl: Submitted application application_1762894202540_0003 2025-11-12 04:53:26,236 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED) 2025-11-12 04:53:26,238 INFO yarn.Client: client token: N/A diagnostics: [星期三 十一月 12 04:53:25 +0800 2025] Scheduler has assigned a container for AM, waiting for AM container to be launched ApplicationMaster host: N/A ApplicationMaster RPC port: -1 queue: default start time: 1762894405197 final status: UNDEFINED tracking URL: http://master:8088/proxy/application_1762894202540_0003/ user: root 2025-11-12 04:53:27,242 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED) 2025-11-12 04:53:28,247 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED) 2025-11-12 04:53:29,250 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED) 2025-11-12 04:53:30,253 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED) 2025-11-12 04:53:31,257 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED) ^C2025-11-12 04:53:32,204 INFO storage.DiskBlockManager: Shutdown hook called 2025-11-12 04:53:32,211 INFO util.ShutdownHookManager: Shutdown hook called 2025-11-12 04:53:32,211 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-9cf7b4b0-d6e0-4258-b2ea-7edb7303c330 2025-11-12 04:53:32,215 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-b7982e3f-5f51-4b89-8b11-120862f2a08c 2025-11-12 04:53:32,216 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-9cf7b4b0-d6e0-4258-b2ea-7edb7303c330/userFiles-3eb53278-d265-42c2-9531-d8811165f34d
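The report above shows that YARN accepted the application and even assigned an AM container ("Scheduler has assigned a container for AM, waiting for AM container to be launched"), but the ApplicationMaster never starts. When the cause is memory allocation, the usual fix is to make sure the NodeManagers can actually supply the memory the AM requests (896 MB here). A minimal sketch of the memory-related settings in `yarn-site.xml`; the values below are illustrative assumptions and must be adjusted to the real memory of your nodes:

```xml
<!-- yarn-site.xml (illustrative values; tune to your cluster) -->
<property>
  <!-- total memory a single NodeManager may hand out to containers -->
  <name>yarn.nodemanager.resource.memory-mb</name>
  <value>4096</value>
</property>
<property>
  <!-- smallest container YARN will allocate -->
  <name>yarn.scheduler.minimum-allocation-mb</name>
  <value>512</value>
</property>
<property>
  <!-- largest container YARN will allocate; must cover the AM request -->
  <name>yarn.scheduler.maximum-allocation-mb</name>
  <value>4096</value>
</property>
<property>
  <!-- the virtual-memory check often kills small AM containers on test clusters -->
  <name>yarn.nodemanager.vmem-check-enabled</name>
  <value>false</value>
</property>
```

After editing, distribute the file to all nodes and restart YARN (`stop-yarn.sh` then `start-yarn.sh`) before resubmitting the job.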