Hadoop: http://192.168.20.114:50070 is inaccessible

This post covers what to do when the Hadoop HDFS web UI at http://192.168.20.114:50070 cannot be reached. In my case the cause was that the filesystem had never been formatted; the fix was simply to run bin/hadoop namenode -format before executing start-all.sh.

If http://192.168.20.114:50070 does not respond, the HDFS service is not running. (50070 is the NameNode web UI's default port on Hadoop 1.x/2.x; on Hadoop 3.x it moved to 9870.)

There are many possible causes; in my case, the filesystem had never been formatted.
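Before anything else, confirm whether the NameNode process is running at all; an unreachable web UI only tells you that something along the way is down. A minimal check sequence, as a sketch (run on the NameNode host; adjust the port to your version):

# Is a NameNode JVM running on this host?
jps | grep -i NameNode

# Is anything listening on the web UI port (50070 on Hadoop 1.x/2.x, 9870 on 3.x)?
ss -lntp | grep -E ':(50070|9870)'

# Fetch the page locally. If this succeeds but remote browsers fail,
# suspect the firewall (firewalld/iptables) rather than Hadoop itself.
curl -sI http://localhost:50070/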


Before running ./bin/start-all.sh, format the HDFS filesystem with bin/hadoop namenode -format (on Hadoop 2.x/3.x, bin/hdfs namenode -format is the current spelling of the same command).
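For reference, a sketch of the full first-start sequence, assuming a tarball install with the Hadoop home directory as the working directory (start-dfs.sh and start-yarn.sh replace the deprecated start-all.sh on recent versions):

# One time only, before the very first start: formatting wipes the
# NameNode metadata, so never rerun it on a cluster that holds data.
bin/hdfs namenode -format

# Bring up HDFS first, then YARN.
sbin/start-dfs.sh
sbin/start-yarn.sh

# NameNode, DataNode, SecondaryNameNode (and the YARN daemons) should
# now appear here; only then is the web UI expected to respond.
jps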


Formatting can itself fail, though. In the following case (Hadoop 3.1.3), hdfs namenode -format aborts before it ever touches the filesystem:

hdfs namenode -format
2025-10-10 20:45:17,586 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = master/192.168.56.130
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 3.1.3
STARTUP_MSG:   classpath = /opt/soft/hadoop/etc/hadoop:... (several hundred jar paths omitted)
STARTUP_MSG:   build = https://gitbox.apache.org/repos/asf/hadoop.git -r ba631c436b806728f8ec2f54ab1e289526c90579; compiled by 'ztang' on 2019-09-12T02:47Z
STARTUP_MSG:   java = 1.8.0_451
************************************************************/
2025-10-10 20:45:17,599 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
2025-10-10 20:45:17,733 INFO namenode.NameNode: createNameNode [-format]
2025-10-10 20:45:18,105 ERROR conf.Configuration: error parsing conf mapred-site.xml
com.ctc.wstx.exc.WstxParsingException: Unexpected close tag </property>; expected </value>.
 at [row,col,system-id]: [23,11,"file:/opt/soft/hadoop/etc/hadoop/mapred-site.xml"]
    at com.ctc.wstx.sr.StreamScanner.constructWfcException(StreamScanner.java:621)
    at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:491)
    at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:475)
    at com.ctc.wstx.sr.BasicStreamReader.reportWrongEndElem(BasicStreamReader.java:3365)
    at com.ctc.wstx.sr.BasicStreamReader.readEndElem(BasicStreamReader.java:3292)
    at com.ctc.wstx.sr.BasicStreamReader.nextFromTree(BasicStreamReader.java:2911)
    at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1123)
    at org.apache.hadoop.conf.Configuration$Parser.parseNext(Configuration.java:3320)
    at org.apache.hadoop.conf.Configuration$Parser.parse(Configuration.java:3114)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3007)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2968)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2848)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1200)
    at org.apache.hadoop.conf.Configuration.getTrimmed(Configuration.java:1254)
    at org.apache.hadoop.conf.Configuration.getLong(Configuration.java:1532)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:113)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:102)
    at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:451)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:336)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:303)
    at org.apache.hadoop.security.UserGroupInformation.isAuthenticationMethodEnabled(UserGroupInformation.java:391)
    at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:385)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1156)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1645)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1755)
2025-10-10 20:45:18,114 ERROR namenode.NameNode: Failed to start namenode.
java.lang.RuntimeException: com.ctc.wstx.exc.WstxParsingException: Unexpected close tag </property>; expected </value>.
 at [row,col,system-id]: [23,11,"file:/opt/soft/hadoop/etc/hadoop/mapred-site.xml"]
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3024)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2968)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2848)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1200)
    at org.apache.hadoop.conf.Configuration.getTrimmed(Configuration.java:1254)
    at org.apache.hadoop.conf.Configuration.getLong(Configuration.java:1532)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:113)
    at org.apache.hadoop.security.Groups.<init>(Groups.java:102)
    at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:451)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:336)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:303)
    at org.apache.hadoop.security.UserGroupInformation.isAuthenticationMethodEnabled(UserGroupInformation.java:391)
    at org.apache.hadoop.security.UserGroupInformation.isSecurityEnabled(UserGroupInformation.java:385)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:1156)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1645)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1755)
Caused by: com.ctc.wstx.exc.WstxParsingException: Unexpected close tag </property>; expected </value>.
 at [row,col,system-id]: [23,11,"file:/opt/soft/hadoop/etc/hadoop/mapred-site.xml"]
    at com.ctc.wstx.sr.StreamScanner.constructWfcException(StreamScanner.java:621)
    at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:491)
    at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:475)
    at com.ctc.wstx.sr.BasicStreamReader.reportWrongEndElem(BasicStreamReader.java:3365)
    at com.ctc.wstx.sr.BasicStreamReader.readEndElem(BasicStreamReader.java:3292)
    at com.ctc.wstx.sr.BasicStreamReader.nextFromTree(BasicStreamReader.java:2911)
    at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1123)
    at org.apache.hadoop.conf.Configuration$Parser.parseNext(Configuration.java:3320)
    at org.apache.hadoop.conf.Configuration$Parser.parse(Configuration.java:3114)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3007)
    ... 15 more
2025-10-10 20:45:18,127 INFO util.ExitUtil: Exiting with status 1: java.lang.RuntimeException: com.ctc.wstx.exc.WstxParsingException: Unexpected close tag </property>; expected </value>.
 at [row,col,system-id]: [23,11,"file:/opt/soft/hadoop/etc/hadoop/mapred-site.xml"]
2025-10-10 20:45:18,138 INFO namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at master/192.168.56.130
************************************************************/
2025-10-10 20:45:18,169 ERROR conf.Configuration: error parsing conf mapred-site.xml
com.ctc.wstx.exc.WstxParsingException: Unexpected close tag </property>; expected </value>.
 at [row,col,system-id]: [23,11,"file:/opt/soft/hadoop/etc/hadoop/mapred-site.xml"]
    at com.ctc.wstx.sr.StreamScanner.constructWfcException(StreamScanner.java:621)
    at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:491)
    at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:475)
    at com.ctc.wstx.sr.BasicStreamReader.reportWrongEndElem(BasicStreamReader.java:3365)
    at com.ctc.wstx.sr.BasicStreamReader.readEndElem(BasicStreamReader.java:3292)
    at com.ctc.wstx.sr.BasicStreamReader.nextFromTree(BasicStreamReader.java:2911)
    at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1123)
    at org.apache.hadoop.conf.Configuration$Parser.parseNext(Configuration.java:3320)
    at org.apache.hadoop.conf.Configuration$Parser.parse(Configuration.java:3114)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3007)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2968)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2848)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1200)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1812)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1789)
    at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
    at org.apache.hadoop.util.ShutdownHookManager.shutdownExecutor(ShutdownHookManager.java:145)
    at org.apache.hadoop.util.ShutdownHookManager.access$300(ShutdownHookManager.java:65)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:102)
Exception in thread "Thread-1" java.lang.RuntimeException: com.ctc.wstx.exc.WstxParsingException: Unexpected close tag </property>; expected </value>.
 at [row,col,system-id]: [23,11,"file:/opt/soft/hadoop/etc/hadoop/mapred-site.xml"]
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3024)
    at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2968)
    at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2848)
    at org.apache.hadoop.conf.Configuration.get(Configuration.java:1200)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1812)
    at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1789)
    at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
    at org.apache.hadoop.util.ShutdownHookManager.shutdownExecutor(ShutdownHookManager.java:145)
    at org.apache.hadoop.util.ShutdownHookManager.access$300(ShutdownHookManager.java:65)
    at org.apache.hadoop.util.ShutdownHookManager$1.run(ShutdownHookManager.java:102)
Caused by: com.ctc.wstx.exc.WstxParsingException: Unexpected close tag </property>; expected </value>.
 at [row,col,system-id]: [23,11,"file:/opt/soft/hadoop/etc/hadoop/mapred-site.xml"]
    at com.ctc.wstx.sr.StreamScanner.constructWfcException(StreamScanner.java:621)
    at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:491)
    at com.ctc.wstx.sr.StreamScanner.throwParseError(StreamScanner.java:475)
    at com.ctc.wstx.sr.BasicStreamReader.reportWrongEndElem(BasicStreamReader.java:3365)
    at com.ctc.wstx.sr.BasicStreamReader.readEndElem(BasicStreamReader.java:3292)
    at com.ctc.wstx.sr.BasicStreamReader.nextFromTree(BasicStreamReader.java:2911)
    at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1123)
    at org.apache.hadoop.conf.Configuration$Parser.parseNext(Configuration.java:3320)
    at org.apache.hadoop.conf.Configuration$Parser.parse(Configuration.java:3114)
    at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3007)
    ... 9 more

Now what?
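The answer is in the first ERROR line: formatting never started because Hadoop could not parse its configuration. Line 23, column 11 of /opt/soft/hadoop/etc/hadoop/mapred-site.xml closes a <property> element while a <value> is still open, i.e. a </value> tag is missing or mistyped. For comparison, a minimal well-formed mapred-site.xml; the property shown is the common mapreduce.framework.name setting and is only a stand-in for whatever that file actually declares:

<?xml version="1.0"?>
<configuration>
  <!-- Each <property> must close its <name> and <value> children; a
       missing </value> produces exactly the parser error logged above. -->
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>

A quick way to verify the fix before rerunning the format is xmllint --noout /opt/soft/hadoop/etc/hadoop/mapred-site.xml, which prints nothing when the file is well-formed.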
A related case: a MapReduce job submission fails at the RPC layer while talking to the NameNode:

    at telephoneProject.DataClean.main(DataClean.java:114)
INFO [main] - Cleaning up the staging area file:/tmp/hadoop/mapred/staging/1023178433931/.staging/job_local78433931_0001
Exception in thread "main" java.io.IOException: Failed on local exception: org.apache.hadoop.ipc.RpcException: RPC response exceeds maximum data length; Host Details : local host is: "chen/192.168.66.121"; destination host is: "node101":9870;
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:816)
    at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1549)
    at org.apache.hadoop.ipc.Client.call(Client.java:1491)
    at org.apache.hadoop.ipc.Client.call(Client.java:1388)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233)
    at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:118)
    at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:904)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:422)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeMethod(RetryInvocationHandler.java:165)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invoke(RetryInvocationHandler.java:157)
    at org.apache.hadoop.io.retry.RetryInvocationHandler$Call.invokeOnce(RetryInvocationHandler.java:95)
    at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:359)
    at com.sun.proxy.$Proxy10.getFileInfo(Unknown Source)
    at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1661)
    at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1577)
    at org.apache.hadoop.hdfs.DistributedFileSystem$29.doCall(DistributedFileSystem.java:1574)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1589)
    at org.apache.hadoop.fs.Globber.getFileStatus(Globber.java:65)
    at org.apache.hadoop.fs.Globber.doGlob(Globber.java:281)
    at org.apache.hadoop.fs.Globber.glob(Globber.java:149)
    at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:2034)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:303)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:274)
    at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:396)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:310)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:327)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:200)
    at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570)
    at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567)
    at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1588)
    at telephoneProject.DataClean.main(DataClean.java:114)
Caused by: org.apache.hadoop.ipc.RpcException: RPC response exceeds maximum data length
    at org.apache.hadoop.ipc.Client$IpcStreams.readResponse(Client.java:1864)
    at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1183)
    at org.apache.hadoop.ipc.Client$Connection.run(Client.java:1079)
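The telltale detail is the destination "node101":9870. On Hadoop 3.x, 9870 is the NameNode's HTTP (web UI) port, not its RPC port, so the RPC client receives an HTTP response and trips the maximum-data-length sanity check. The client's fs.defaultFS must point at the RPC port instead, commonly 8020 or 9000; use whatever the cluster's own core-site.xml declares. A sketch of the corrected client-side core-site.xml, assuming the RPC port is 8020:

<configuration>
  <property>
    <name>fs.defaultFS</name>
    <!-- NameNode RPC endpoint; 9870 is the web UI, not RPC -->
    <value>hdfs://node101:8020</value>
  </property>
</configuration>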
Another case: a SparkPi job submitted to YARN that never leaves the ACCEPTED state:

[root@master conf]# spark-submit --master yarn --class org.apache.spark.examples.SparkPi $SPARK_HOME/examples/jars/spark-examples_2.12-3.1.1.jar
25/11/12 04:52:13 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
25/11/12 04:52:14 INFO SparkContext: Running Spark version 3.1.1
25/11/12 04:52:14 INFO ResourceUtils: ==============================================================
25/11/12 04:52:14 INFO ResourceUtils: No custom resources configured for spark.driver.
25/11/12 04:52:14 INFO ResourceUtils: ==============================================================
25/11/12 04:52:14 INFO SparkContext: Submitted application: Spark Pi
25/11/12 04:52:14 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
25/11/12 04:52:14 INFO ResourceProfile: Limiting resource is cpus at 1 tasks per executor
25/11/12 04:52:14 INFO ResourceProfileManager: Added ResourceProfile id: 0
25/11/12 04:52:14 INFO SecurityManager: Changing view acls to: root
25/11/12 04:52:14 INFO SecurityManager: Changing modify acls to: root
25/11/12 04:52:14 INFO SecurityManager: Changing view acls groups to:
25/11/12 04:52:14 INFO SecurityManager: Changing modify acls groups to:
25/11/12 04:52:14 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
25/11/12 04:52:15 INFO Utils: Successfully started service 'sparkDriver' on port 34815.
25/11/12 04:52:15 INFO SparkEnv: Registering MapOutputTracker
25/11/12 04:52:15 INFO SparkEnv: Registering BlockManagerMaster
25/11/12 04:52:15 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
25/11/12 04:52:15 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
25/11/12 04:52:15 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
25/11/12 04:52:15 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-3940abd9-f976-4673-ac49-5bffc32a9ec4
25/11/12 04:52:15 INFO MemoryStore: MemoryStore started with capacity 413.9 MiB
25/11/12 04:52:15 INFO SparkEnv: Registering OutputCommitCoordinator
25/11/12 04:52:15 INFO Utils: Successfully started service 'SparkUI' on port 4040.
25/11/12 04:52:15 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://master:4040
25/11/12 04:52:15 INFO SparkContext: Added JAR file:/opt/module/spark/examples/jars/spark-examples_2.12-3.1.1.jar at spark://master:34815/jars/spark-examples_2.12-3.1.1.jar with timestamp 1762894334373
25/11/12 04:52:16 INFO RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
25/11/12 04:52:16 INFO Client: Requesting a new application from cluster with 3 NodeManagers
25/11/12 04:52:17 INFO Configuration: resource-types.xml not found
25/11/12 04:52:17 INFO ResourceUtils: Unable to find 'resource-types.xml'.
25/11/12 04:52:17 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
25/11/12 04:52:17 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
25/11/12 04:52:17 INFO Client: Setting up container launch context for our AM
25/11/12 04:52:17 INFO Client: Setting up the launch environment for our AM container
25/11/12 04:52:17 INFO Client: Preparing resources for our AM container
25/11/12 04:52:17 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
25/11/12 04:52:18 INFO Client: Uploading resource file:/tmp/spark-0eb147ed-0ede-4f24-90b9-65a8f65aa65b/__spark_libs__1298849544083591332.zip -> file:/root/.sparkStaging/application_1762894202540_0002/__spark_libs__1298849544083591332.zip
25/11/12 04:52:19 INFO Client: Uploading resource file:/tmp/spark-0eb147ed-0ede-4f24-90b9-65a8f65aa65b/__spark_conf__2273853142442443753.zip -> file:/root/.sparkStaging/application_1762894202540_0002/__spark_conf__.zip
25/11/12 04:52:19 INFO SecurityManager: Changing view acls to: root
25/11/12 04:52:19 INFO SecurityManager: Changing modify acls to: root
25/11/12 04:52:19 INFO SecurityManager: Changing view acls groups to:
25/11/12 04:52:19 INFO SecurityManager: Changing modify acls groups to:
25/11/12 04:52:19 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
25/11/12 04:52:19 INFO Client: Submitting application application_1762894202540_0002 to ResourceManager
25/11/12 04:52:20 INFO YarnClientImpl: Submitted application application_1762894202540_0002
25/11/12 04:52:21 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED)
25/11/12 04:52:21 INFO Client:
     client token: N/A
     diagnostics: [Wed Nov 12 04:52:20 +0800 2025] Scheduler has assigned a container for AM, waiting for AM container to be launched
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1762894339943
     final status: UNDEFINED
     tracking URL: http://master:8088/proxy/application_1762894202540_0002/
     user: root
25/11/12 04:52:22 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED)
25/11/12 04:52:23 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED)
25/11/12 04:52:24 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED)
25/11/12 04:52:25 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED)
25/11/12 04:52:26 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED)
25/11/12 04:52:27 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED)
25/11/12 04:52:28 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED)
25/11/12 04:52:29 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED)
^C25/11/12 04:52:30 INFO DiskBlockManager: Shutdown hook called
25/11/12 04:52:30 INFO ShutdownHookManager: Shutdown hook called
25/11/12 04:52:30 INFO ShutdownHookManager: Deleting directory /tmp/spark-c213ed8f-d019-49b8-af3c-48d7d225c929
25/11/12 04:52:30 INFO ShutdownHookManager: Deleting directory /tmp/spark-0eb147ed-0ede-4f24-90b9-65a8f65aa65b/userFiles-0c002819-b899-477a-8142-ea070ae03495
25/11/12 04:52:30 INFO ShutdownHookManager: Deleting directory /tmp/spark-0eb147ed-0ede-4f24-90b9-65a8f65aa65b
[root@master conf]# vi spark-env.sh
[root@master conf]# spark-submit --master yarn --class org.apache.spark.examples.SparkPi $SPARK_HOME/examples/jars/spark-examples_2.12.jar
2025-11-12 04:52:57,994 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2025-11-12 04:52:58,381 WARN deploy.DependencyUtils: Local jar /opt/module/spark/examples/jars/spark-examples_2.12.jar does not exist, skipping.
Error: Failed to load class org.apache.spark.examples.SparkPi.
2025-11-12 04:52:58,394 INFO util.ShutdownHookManager: Shutdown hook called
2025-11-12 04:52:58,394 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-7e0279a0-e7f1-4002-a93b-a217deae6472
[root@master conf]# spark-submit --master yarn --class org.apache.spark.examples.SparkPi $SPARK_HOME/examples/jars/spark-examples.jar
2025-11-12 04:53:09,709 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2025-11-12 04:53:09,983 WARN deploy.DependencyUtils: Local jar /opt/module/spark/examples/jars/spark-examples.jar does not exist, skipping.
Error: Failed to load class org.apache.spark.examples.SparkPi.
2025-11-12 04:53:09,997 INFO util.ShutdownHookManager: Shutdown hook called
2025-11-12 04:53:09,998 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-ced88073-ab63-4a74-881c-ec43e436e161
[root@master conf]# spark-submit --master yarn --class org.apache.spark.examples.SparkPi $SPARK_HOME/examples/jars/spark-examples_2.12-3.1.1.jar
2025-11-12 04:53:15,900 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2025-11-12 04:53:16,256 INFO spark.SparkContext: Running Spark version 3.1.1
2025-11-12 04:53:16,332 INFO resource.ResourceUtils: ==============================================================
2025-11-12 04:53:16,332 INFO resource.ResourceUtils: No custom resources configured for spark.driver.
2025-11-12 04:53:16,332 INFO resource.ResourceUtils: ==============================================================
2025-11-12 04:53:16,333 INFO spark.SparkContext: Submitted application: Spark Pi
2025-11-12 04:53:16,377 INFO resource.ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
2025-11-12 04:53:16,435 INFO resource.ResourceProfile: Limiting resource is cpus at 1 tasks per executor
2025-11-12 04:53:16,437 INFO resource.ResourceProfileManager: Added ResourceProfile id: 0
2025-11-12 04:53:16,566 INFO spark.SecurityManager: Changing view acls to: root
2025-11-12 04:53:16,566 INFO spark.SecurityManager: Changing modify acls to: root
2025-11-12 04:53:16,566 INFO spark.SecurityManager: Changing view acls groups to:
2025-11-12 04:53:16,566 INFO spark.SecurityManager: Changing modify acls groups to:
2025-11-12 04:53:16,566 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
2025-11-12 04:53:16,890 INFO util.Utils: Successfully started service 'sparkDriver' on port 33022.
2025-11-12 04:53:16,935 INFO spark.SparkEnv: Registering MapOutputTracker
2025-11-12 04:53:16,980 INFO spark.SparkEnv: Registering BlockManagerMaster
2025-11-12 04:53:17,006 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2025-11-12 04:53:17,007 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
2025-11-12 04:53:17,094 INFO spark.SparkEnv: Registering BlockManagerMasterHeartbeat
2025-11-12 04:53:17,114 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-6a6708df-05b8-407f-9011-dd935e59e38f
2025-11-12 04:53:17,140 INFO memory.MemoryStore: MemoryStore started with capacity 413.9 MiB
2025-11-12 04:53:17,206 INFO spark.SparkEnv: Registering OutputCommitCoordinator
2025-11-12 04:53:17,346 INFO util.log: Logging initialized @3309ms to org.sparkproject.jetty.util.log.Slf4jLog
2025-11-12 04:53:17,506 INFO server.Server: jetty-9.4.36.v20210114; built: 2021-01-14T16:44:28.689Z; git: 238ec6997c7806b055319a6d11f8ae7564adc0de; jvm 1.8.0_211-b12
2025-11-12 04:53:17,590 INFO server.Server: Started @3554ms
2025-11-12 04:53:17,704 INFO server.AbstractConnector: Started ServerConnector@68ed96ca{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
2025-11-12 04:53:17,705 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
2025-11-12 04:53:17,734 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@a23a01d{/jobs,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,736 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@61a5b4ae{/jobs/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,736 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5b69fd74{/jobs/job,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,744 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@63a5e46c{/jobs/job/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,744 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@49ef32e0{/stages,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,745 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6bd51ed8{/stages/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,746 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@51abf713{/stages/stage,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,747 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3fc08eec{/stages/stage/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,748 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7b02e036{/stages/pool,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,748 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1e287667{/stages/pool/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,749 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4201a617{/storage,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,749 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1bb9aa43{/storage/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,750 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@df5f5c0{/storage/rdd,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,751 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@66b72664{/storage/rdd/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,751 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@58cd06cb{/environment,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,752 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@64b31700{/environment/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,753 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@bae47a0{/executors,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,753 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@85ec632{/executors/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,754 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@65ef722a{/executors/threadDump,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,760 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@214894fc{/executors/threadDump/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,772 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@e362c57{/static,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,773 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@24528a25{/,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,775 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@59221b97{/api,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,775 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3ee39da0{/jobs/job/kill,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,776 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7cc9ce8{/stages/stage/kill,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,778 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://master:4040
2025-11-12 04:53:17,821 INFO spark.SparkContext: Added JAR file:/opt/module/spark/examples/jars/spark-examples_2.12-3.1.1.jar at spark://master:33022/jars/spark-examples_2.12-3.1.1.jar with timestamp 1762894396246
2025-11-12 04:53:18,269 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.43.100:8032
2025-11-12 04:53:18,540 INFO yarn.Client: Requesting a new application from cluster with 3 NodeManagers
2025-11-12 04:53:19,385 INFO conf.Configuration: resource-types.xml not found
2025-11-12 04:53:19,385 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
2025-11-12 04:53:19,413 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
2025-11-12 04:53:19,413 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
2025-11-12 04:53:19,414 INFO yarn.Client: Setting up container launch context for our AM
2025-11-12 04:53:19,414 INFO yarn.Client: Setting up the launch environment for our AM container
2025-11-12 04:53:19,428 INFO yarn.Client: Preparing resources for our AM container
2025-11-12 04:53:19,492 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
2025-11-12 04:53:21,016 INFO yarn.Client: Uploading resource file:/tmp/spark-9cf7b4b0-d6e0-4258-b2ea-7edb7303c330/__spark_libs__4318401642194911955.zip -> hdfs://master:9000/user/root/.sparkStaging/application_1762894202540_0003/__spark_libs__4318401642194911955.zip
2025-11-12 04:53:24,623 INFO yarn.Client: Uploading resource file:/tmp/spark-9cf7b4b0-d6e0-4258-b2ea-7edb7303c330/__spark_conf__5067439934985698710.zip -> hdfs://master:9000/user/root/.sparkStaging/application_1762894202540_0003/__spark_conf__.zip
2025-11-12 04:53:25,145 INFO spark.SecurityManager: Changing view acls to: root
2025-11-12 04:53:25,145 INFO spark.SecurityManager: Changing modify acls to: root
2025-11-12 04:53:25,145 INFO spark.SecurityManager: Changing view acls groups to:
2025-11-12 04:53:25,145 INFO spark.SecurityManager: Changing modify acls groups to:
2025-11-12 04:53:25,145 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
2025-11-12 04:53:25,182 INFO yarn.Client: Submitting application application_1762894202540_0003 to ResourceManager
2025-11-12 04:53:25,232 INFO impl.YarnClientImpl: Submitted application application_1762894202540_0003
2025-11-12 04:53:26,236 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED)
2025-11-12 04:53:26,238 INFO yarn.Client:
     client token: N/A
     diagnostics: [Wed Nov 12 04:53:25 +0800 2025] Scheduler has assigned a container for AM, waiting for AM container to be launched
     ApplicationMaster host: N/A
     ApplicationMaster RPC port: -1
     queue: default
     start time: 1762894405197
     final status: UNDEFINED
     tracking URL: http://master:8088/proxy/application_1762894202540_0003/
     user: root
2025-11-12 04:53:27,242 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED)
2025-11-12 04:53:28,247 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED)
2025-11-12 04:53:29,250 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED)
2025-11-12 04:53:30,253 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED)
2025-11-12 04:53:31,257 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED)
^C2025-11-12 04:53:32,204 INFO storage.DiskBlockManager: Shutdown hook called
2025-11-12 04:53:32,211 INFO util.ShutdownHookManager: Shutdown hook called
2025-11-12 04:53:32,211 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-9cf7b4b0-d6e0-4258-b2ea-7edb7303c330
2025-11-12 04:53:32,215 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-b7982e3f-5f51-4b89-8b11-120862f2a08c
2025-11-12 04:53:32,216 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-9cf7b4b0-d6e0-4258-b2ea-7edb7303c330/userFiles-3eb53278-d265-42c2-9531-d8811165f34d
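Both submissions sit in ACCEPTED until interrupted with ^C: the ResourceManager queued the application, but no NodeManager ever launched the ApplicationMaster container. That points at YARN capacity or NodeManager health rather than at Spark. Some first checks with the standard yarn CLI, as a sketch (the application id is the one from the log above):

# Are the NodeManagers registered, and do they report usable memory/vcores?
yarn node -list -all

# YARN's own view of the stuck application, including its diagnostics.
yarn application -status application_1762894202540_0003

# After the application ends or is killed, pull whatever container logs exist.
yarn logs -applicationId application_1762894202540_0003

The tracking URL printed in the report (http://master:8088/proxy/application_1762894202540_0003/) shows the same diagnostics in the ResourceManager web UI.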
using builtin-java classes where applicable Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties 25/11/12 04:52:14 INFO SparkContext: Running Spark version 3.1.1 25/11/12 04:52:14 INFO ResourceUtils: ============================================================== 25/11/12 04:52:14 INFO ResourceUtils: No custom resources configured for spark.driver. 25/11/12 04:52:14 INFO ResourceUtils: ============================================================== 25/11/12 04:52:14 INFO SparkContext: Submitted application: Spark Pi 25/11/12 04:52:14 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0) 25/11/12 04:52:14 INFO ResourceProfile: Limiting resource is cpus at 1 tasks per executor 25/11/12 04:52:14 INFO ResourceProfileManager: Added ResourceProfile id: 0 25/11/12 04:52:14 INFO SecurityManager: Changing view acls to: root 25/11/12 04:52:14 INFO SecurityManager: Changing modify acls to: root 25/11/12 04:52:14 INFO SecurityManager: Changing view acls groups to: 25/11/12 04:52:14 INFO SecurityManager: Changing modify acls groups to: 25/11/12 04:52:14 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set() 25/11/12 04:52:15 INFO Utils: Successfully started service 'sparkDriver' on port 34815. 25/11/12 04:52:15 INFO SparkEnv: Registering MapOutputTracker 25/11/12 04:52:15 INFO SparkEnv: Registering BlockManagerMaster 25/11/12 04:52:15 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information 25/11/12 04:52:15 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up 25/11/12 04:52:15 INFO SparkEnv: Registering BlockManagerMasterHeartbeat 25/11/12 04:52:15 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-3940abd9-f976-4673-ac49-5bffc32a9ec4 25/11/12 04:52:15 INFO MemoryStore: MemoryStore started with capacity 413.9 MiB 25/11/12 04:52:15 INFO SparkEnv: Registering OutputCommitCoordinator 25/11/12 04:52:15 INFO Utils: Successfully started service 'SparkUI' on port 4040. 25/11/12 04:52:15 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://master:4040 25/11/12 04:52:15 INFO SparkContext: Added JAR file:/opt/module/spark/examples/jars/spark-examples_2.12-3.1.1.jar at spark://master:34815/jars/spark-examples_2.12-3.1.1.jar with timestamp 1762894334373 25/11/12 04:52:16 INFO RMProxy: Connecting to ResourceManager at /0.0.0.0:8032 25/11/12 04:52:16 INFO Client: Requesting a new application from cluster with 3 NodeManagers 25/11/12 04:52:17 INFO Configuration: resource-types.xml not found 25/11/12 04:52:17 INFO ResourceUtils: Unable to find 'resource-types.xml'. 
25/11/12 04:52:17 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container) 25/11/12 04:52:17 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead 25/11/12 04:52:17 INFO Client: Setting up container launch context for our AM 25/11/12 04:52:17 INFO Client: Setting up the launch environment for our AM container 25/11/12 04:52:17 INFO Client: Preparing resources for our AM container 25/11/12 04:52:17 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME. 25/11/12 04:52:18 INFO Client: Uploading resource file:/tmp/spark-0eb147ed-0ede-4f24-90b9-65a8f65aa65b/__spark_libs__1298849544083591332.zip -> file:/root/.sparkStaging/application_1762894202540_0002/__spark_libs__1298849544083591332.zip 25/11/12 04:52:19 INFO Client: Uploading resource file:/tmp/spark-0eb147ed-0ede-4f24-90b9-65a8f65aa65b/__spark_conf__2273853142442443753.zip -> file:/root/.sparkStaging/application_1762894202540_0002/__spark_conf__.zip 25/11/12 04:52:19 INFO SecurityManager: Changing view acls to: root 25/11/12 04:52:19 INFO SecurityManager: Changing modify acls to: root 25/11/12 04:52:19 INFO SecurityManager: Changing view acls groups to: 25/11/12 04:52:19 INFO SecurityManager: Changing modify acls groups to: 25/11/12 04:52:19 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set() 25/11/12 04:52:19 INFO Client: Submitting application application_1762894202540_0002 to ResourceManager 25/11/12 04:52:20 INFO YarnClientImpl: Submitted application application_1762894202540_0002 25/11/12 04:52:21 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED) 25/11/12 04:52:21 INFO Client: client token: N/A diagnostics: [星期三 十一月 12 04:52:20 +0800 2025] Scheduler has assigned a container for AM, waiting for AM container to be launched ApplicationMaster host: N/A ApplicationMaster RPC port: -1 queue: default start time: 1762894339943 final status: UNDEFINED tracking URL: http://master:8088/proxy/application_1762894202540_0002/ user: root 25/11/12 04:52:22 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED) 25/11/12 04:52:23 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED) 25/11/12 04:52:24 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED) 25/11/12 04:52:25 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED) 25/11/12 04:52:26 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED) :25/11/12 04:52:27 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED) 25/11/12 04:52:28 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED) 25/11/12 04:52:29 INFO Client: Application report for application_1762894202540_0002 (state: ACCEPTED) ^C25/11/12 04:52:30 INFO DiskBlockManager: Shutdown hook called 25/11/12 04:52:30 INFO ShutdownHookManager: Shutdown hook called 25/11/12 04:52:30 INFO ShutdownHookManager: Deleting directory /tmp/spark-c213ed8f-d019-49b8-af3c-48d7d225c929 25/11/12 04:52:30 INFO ShutdownHookManager: Deleting directory /tmp/spark-0eb147ed-0ede-4f24-90b9-65a8f65aa65b/userFiles-0c002819-b899-477a-8142-ea070ae03495 25/11/12 04:52:30 
INFO ShutdownHookManager: Deleting directory /tmp/spark-0eb147ed-0ede-4f24-90b9-65a8f65aa65b [root@master conf]# vi spark-env.sh [root@master conf]# spark-submit --master yarn --class org.apache.spark.examples.SparkPi $SPARK_HOME/examples/jars/spark-examples_2.12.jar 2025-11-12 04:52:57,994 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 2025-11-12 04:52:58,381 WARN deploy.DependencyUtils: Local jar /opt/module/spark/examples/jars/spark-examples_2.12.jar does not exist, skipping. Error: Failed to load class org.apache.spark.examples.SparkPi. 2025-11-12 04:52:58,394 INFO util.ShutdownHookManager: Shutdown hook called 2025-11-12 04:52:58,394 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-7e0279a0-e7f1-4002-a93b-a217deae6472 [root@master conf]# spark-submit --master yarn --class org.apache.spark.examples.SparkPi $SPARK_HOME/examples/jars/spark-examples.jar 2025-11-12 04:53:09,709 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 2025-11-12 04:53:09,983 WARN deploy.DependencyUtils: Local jar /opt/module/spark/examples/jars/spark-examples.jar does not exist, skipping. Error: Failed to load class org.apache.spark.examples.SparkPi. 2025-11-12 04:53:09,997 INFO util.ShutdownHookManager: Shutdown hook called 2025-11-12 04:53:09,998 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-ced88073-ab63-4a74-881c-ec43e436e161 [root@master conf]# spark-submit --master yarn --class org.apache.spark.examples.SparkPi $SPARK_HOME/examples/jars/spark-examples_2.12-3.1.1.jar 2025-11-12 04:53:15,900 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 2025-11-12 04:53:16,256 INFO spark.SparkContext: Running Spark version 3.1.1 2025-11-12 04:53:16,332 INFO resource.ResourceUtils: ============================================================== 2025-11-12 04:53:16,332 INFO resource.ResourceUtils: No custom resources configured for spark.driver. 2025-11-12 04:53:16,332 INFO resource.ResourceUtils: ============================================================== 2025-11-12 04:53:16,333 INFO spark.SparkContext: Submitted application: Spark Pi 2025-11-12 04:53:16,377 INFO resource.ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0) 2025-11-12 04:53:16,435 INFO resource.ResourceProfile: Limiting resource is cpus at 1 tasks per executor 2025-11-12 04:53:16,437 INFO resource.ResourceProfileManager: Added ResourceProfile id: 0 2025-11-12 04:53:16,566 INFO spark.SecurityManager: Changing view acls to: root 2025-11-12 04:53:16,566 INFO spark.SecurityManager: Changing modify acls to: root 2025-11-12 04:53:16,566 INFO spark.SecurityManager: Changing view acls groups to: 2025-11-12 04:53:16,566 INFO spark.SecurityManager: Changing modify acls groups to: 2025-11-12 04:53:16,566 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set() 2025-11-12 04:53:16,890 INFO util.Utils: Successfully started service 'sparkDriver' on port 33022. 
[root@master conf]# spark-submit --master yarn --class org.apache.spark.examples.SparkPi $SPARK_HOME/examples/jars/spark-examples_2.12-3.1.1.jar
2025-11-12 04:53:15,900 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2025-11-12 04:53:16,256 INFO spark.SparkContext: Running Spark version 3.1.1
2025-11-12 04:53:16,332 INFO resource.ResourceUtils: ==============================================================
2025-11-12 04:53:16,332 INFO resource.ResourceUtils: No custom resources configured for spark.driver.
2025-11-12 04:53:16,332 INFO resource.ResourceUtils: ==============================================================
2025-11-12 04:53:16,333 INFO spark.SparkContext: Submitted application: Spark Pi
2025-11-12 04:53:16,377 INFO resource.ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
2025-11-12 04:53:16,435 INFO resource.ResourceProfile: Limiting resource is cpus at 1 tasks per executor
2025-11-12 04:53:16,437 INFO resource.ResourceProfileManager: Added ResourceProfile id: 0
2025-11-12 04:53:16,566 INFO spark.SecurityManager: Changing view acls to: root
2025-11-12 04:53:16,566 INFO spark.SecurityManager: Changing modify acls to: root
2025-11-12 04:53:16,566 INFO spark.SecurityManager: Changing view acls groups to:
2025-11-12 04:53:16,566 INFO spark.SecurityManager: Changing modify acls groups to:
2025-11-12 04:53:16,566 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
2025-11-12 04:53:16,890 INFO util.Utils: Successfully started service 'sparkDriver' on port 33022.
2025-11-12 04:53:16,935 INFO spark.SparkEnv: Registering MapOutputTracker
2025-11-12 04:53:16,980 INFO spark.SparkEnv: Registering BlockManagerMaster
2025-11-12 04:53:17,006 INFO storage.BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2025-11-12 04:53:17,007 INFO storage.BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
2025-11-12 04:53:17,094 INFO spark.SparkEnv: Registering BlockManagerMasterHeartbeat
2025-11-12 04:53:17,114 INFO storage.DiskBlockManager: Created local directory at /tmp/blockmgr-6a6708df-05b8-407f-9011-dd935e59e38f
2025-11-12 04:53:17,140 INFO memory.MemoryStore: MemoryStore started with capacity 413.9 MiB
2025-11-12 04:53:17,206 INFO spark.SparkEnv: Registering OutputCommitCoordinator
2025-11-12 04:53:17,346 INFO util.log: Logging initialized @3309ms to org.sparkproject.jetty.util.log.Slf4jLog
2025-11-12 04:53:17,506 INFO server.Server: jetty-9.4.36.v20210114; built: 2021-01-14T16:44:28.689Z; git: 238ec6997c7806b055319a6d11f8ae7564adc0de; jvm 1.8.0_211-b12
2025-11-12 04:53:17,590 INFO server.Server: Started @3554ms
2025-11-12 04:53:17,704 INFO server.AbstractConnector: Started ServerConnector@68ed96ca{HTTP/1.1, (http/1.1)}{0.0.0.0:4040}
2025-11-12 04:53:17,705 INFO util.Utils: Successfully started service 'SparkUI' on port 4040.
2025-11-12 04:53:17,734 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@a23a01d{/jobs,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,736 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@61a5b4ae{/jobs/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,736 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@5b69fd74{/jobs/job,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,744 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@63a5e46c{/jobs/job/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,744 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@49ef32e0{/stages,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,745 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@6bd51ed8{/stages/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,746 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@51abf713{/stages/stage,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,747 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3fc08eec{/stages/stage/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,748 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7b02e036{/stages/pool,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,748 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1e287667{/stages/pool/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,749 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@4201a617{/storage,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,749 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@1bb9aa43{/storage/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,750 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@df5f5c0{/storage/rdd,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,751 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@66b72664{/storage/rdd/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,751 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@58cd06cb{/environment,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,752 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@64b31700{/environment/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,753 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@bae47a0{/executors,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,753 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@85ec632{/executors/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,754 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@65ef722a{/executors/threadDump,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,760 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@214894fc{/executors/threadDump/json,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,772 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@e362c57{/static,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,773 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@24528a25{/,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,775 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@59221b97{/api,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,775 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@3ee39da0{/jobs/job/kill,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,776 INFO handler.ContextHandler: Started o.s.j.s.ServletContextHandler@7cc9ce8{/stages/stage/kill,null,AVAILABLE,@Spark}
2025-11-12 04:53:17,778 INFO ui.SparkUI: Bound SparkUI to 0.0.0.0, and started at http://master:4040
2025-11-12 04:53:17,821 INFO spark.SparkContext: Added JAR file:/opt/module/spark/examples/jars/spark-examples_2.12-3.1.1.jar at spark://master:33022/jars/spark-examples_2.12-3.1.1.jar with timestamp 1762894396246
2025-11-12 04:53:18,269 INFO client.RMProxy: Connecting to ResourceManager at master/192.168.43.100:8032
2025-11-12 04:53:18,540 INFO yarn.Client: Requesting a new application from cluster with 3 NodeManagers
2025-11-12 04:53:19,385 INFO conf.Configuration: resource-types.xml not found
2025-11-12 04:53:19,385 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
2025-11-12 04:53:19,413 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (8192 MB per container)
2025-11-12 04:53:19,413 INFO yarn.Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
2025-11-12 04:53:19,414 INFO yarn.Client: Setting up container launch context for our AM
2025-11-12 04:53:19,414 INFO yarn.Client: Setting up the launch environment for our AM container
2025-11-12 04:53:19,428 INFO yarn.Client: Preparing resources for our AM container
2025-11-12 04:53:19,492 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
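The "Neither spark.yarn.jars nor spark.yarn.archive is set" warning means every submission re-zips and re-uploads the whole $SPARK_HOME/jars directory (the __spark_libs__*.zip uploads in this log, a couple of hundred megabytes). A common optimization is to publish the Spark jars to HDFS once and point the config at them; this is a sketch, and the HDFS path /spark/jars is an assumption:

hdfs dfs -mkdir -p /spark/jars
hdfs dfs -put $SPARK_HOME/jars/* /spark/jars/
# then add one line to $SPARK_HOME/conf/spark-defaults.conf:
spark.yarn.jars hdfs://master:9000/spark/jars/*

After that, submissions skip the library upload step entirely and the warning disappears.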
2025-11-12 04:53:21,016 INFO yarn.Client: Uploading resource file:/tmp/spark-9cf7b4b0-d6e0-4258-b2ea-7edb7303c330/__spark_libs__4318401642194911955.zip -> hdfs://master:9000/user/root/.sparkStaging/application_1762894202540_0003/__spark_libs__4318401642194911955.zip
2025-11-12 04:53:24,623 INFO yarn.Client: Uploading resource file:/tmp/spark-9cf7b4b0-d6e0-4258-b2ea-7edb7303c330/__spark_conf__5067439934985698710.zip -> hdfs://master:9000/user/root/.sparkStaging/application_1762894202540_0003/__spark_conf__.zip
2025-11-12 04:53:25,145 INFO spark.SecurityManager: Changing view acls to: root
2025-11-12 04:53:25,145 INFO spark.SecurityManager: Changing modify acls to: root
2025-11-12 04:53:25,145 INFO spark.SecurityManager: Changing view acls groups to:
2025-11-12 04:53:25,145 INFO spark.SecurityManager: Changing modify acls groups to:
2025-11-12 04:53:25,145 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
2025-11-12 04:53:25,182 INFO yarn.Client: Submitting application application_1762894202540_0003 to ResourceManager
2025-11-12 04:53:25,232 INFO impl.YarnClientImpl: Submitted application application_1762894202540_0003
2025-11-12 04:53:26,236 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED)
2025-11-12 04:53:26,238 INFO yarn.Client:
	 client token: N/A
	 diagnostics: [Wed Nov 12 04:53:25 +0800 2025] Scheduler has assigned a container for AM, waiting for AM container to be launched
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: default
	 start time: 1762894405197
	 final status: UNDEFINED
	 tracking URL: http://master:8088/proxy/application_1762894202540_0003/
	 user: root
2025-11-12 04:53:27,242 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED)
2025-11-12 04:53:28,247 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED)
2025-11-12 04:53:29,250 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED)
2025-11-12 04:53:30,253 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED)
2025-11-12 04:53:31,257 INFO yarn.Client: Application report for application_1762894202540_0003 (state: ACCEPTED)
^C2025-11-12 04:53:32,204 INFO storage.DiskBlockManager: Shutdown hook called
2025-11-12 04:53:32,211 INFO util.ShutdownHookManager: Shutdown hook called
2025-11-12 04:53:32,211 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-9cf7b4b0-d6e0-4258-b2ea-7edb7303c330
2025-11-12 04:53:32,215 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-b7982e3f-5f51-4b89-8b11-120862f2a08c
2025-11-12 04:53:32,216 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-9cf7b4b0-d6e0-4258-b2ea-7edb7303c330/userFiles-3eb53278-d265-42c2-9531-d8811165f34d
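This run was also interrupted with ^C while still in state ACCEPTED. ACCEPTED only means the ResourceManager has queued the application; the AM container has not been launched yet, and a few seconds of ACCEPTED is normal, so this job may simply have needed more time. Note the requested AM container is small: 896 MB is the default 512 MB of spark.yarn.am.memory plus the 384 MB minimum overhead, well under the 8192 MB per-container cap, so an application that stays ACCEPTED for minutes points at NodeManagers that are down or unhealthy, or a queue with no free memory/vcores, rather than an oversized request. Standard YARN CLI checks, using the application id from this log:

yarn node -list -all
yarn application -status application_1762894202540_0003
# the tracking URL from the report shows the queue and pending resources:
# http://master:8088/proxy/application_1762894202540_0003/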