Resolving the conflict between hadoop-client and jetty

This post describes how to configure a project's Apache Spark and Hadoop dependencies, excluding potentially conflicting libraries such as Netty and Jetty, so that the job runs stably in a cluster environment.
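As a concrete sketch of that configuration, the snippet below assumes a Gradle build using the Kotlin DSL (the original project may instead use Maven or a hand-assembled classpath) and uses placeholder Spark and hadoop-client versions; both should be aligned with whatever the cluster actually runs. The idea is to let Spark keep the Netty and Jetty it bundles and to strip the copies that hadoop-client can pull in transitively.

```kotlin
// build.gradle.kts — a minimal sketch, not the author's actual build file.
// hadoop-client can drag in its own Netty and Jetty jars; excluding them here
// leaves only the copies bundled with Spark on the runtime classpath.
dependencies {
    // Spark ships the Netty/Jetty versions it was built against; keep these.
    implementation("org.apache.spark:spark-core_2.12:3.3.0")   // placeholder version

    implementation("org.apache.hadoop:hadoop-client:3.3.4") {  // placeholder version
        exclude(group = "io.netty")           // all Netty artifacts
        exclude(group = "org.eclipse.jetty")  // all Jetty artifacts
    }
}
```

Maven users can express the same thing with exclusions declared on the hadoop-client dependency in the POM; either way, the goal is that exactly one copy of Netty and of Jetty ends up on the classpath.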
DEPRECATED: Use of this script to execute hdfs command is deprecated. Instead use the hdfs command for it. 2025-06-18 16:50:39,734 INFO datanode.DataNode: STARTUP_MSG: /************************************************************ STARTUP_MSG: Starting DataNode STARTUP_MSG: host = LAPTOP-FK5QKFGQ/192.168.10.1 STARTUP_MSG: args = [] STARTUP_MSG: version = 3.2.2 STARTUP_MSG: classpath = D:\pyspark\Hadoop\hadoop-3.2.2\etc\hadoop;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\accessors-smart-1.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\animal-sniffer-annotations-1.17.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\asm-5.0.4.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\audience-annotations-0.5.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\avro-1.7.7.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\checker-qual-2.5.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\commons-beanutils-1.9.4.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\commons-cli-1.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\commons-codec-1.11.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\commons-collections-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\commons-compress-1.19.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\commons-configuration2-2.1.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\commons-io-2.5.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\commons-lang3-3.7.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\commons-logging-1.1.3.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\commons-math3-3.1.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\commons-net-3.6.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\commons-text-1.4.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\curator-client-2.13.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\curator-framework-2.13.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\curator-recipes-2.13.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\dnsjava-2.1.7.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\error_prone_annotations-2.2.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\failureaccess-1.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\gson-2.2.4.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\guava-27.0-jre.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\hadoop-annotations-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\hadoop-auth-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\htrace-core4-4.1.0-incubating.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\httpclient-4.5.13.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\httpcore-4.4.13.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\j2objc-annotations-1.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jackson-annotations-2.9.10.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jackson-core-2.9.10.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jackson-core-asl-1.9.13.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jackson-databind-2.9.10.4.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jackson-jaxrs-1.9.13.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jackson-mapper-asl-1.9.13.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\h
adoop\common\lib\jackson-xc-1.9.13.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\javax.activation-api-1.2.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\javax.servlet-api-3.1.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jaxb-api-2.2.11.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jaxb-impl-2.2.3-1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jcip-annotations-1.0-1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jersey-core-1.19.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jersey-json-1.19.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jersey-server-1.19.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jersey-servlet-1.19.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jettison-1.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jetty-http-9.4.20.v20190813.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jetty-io-9.4.20.v20190813.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jetty-security-9.4.20.v20190813.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jetty-server-9.4.20.v20190813.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jetty-servlet-9.4.20.v20190813.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jetty-util-9.4.20.v20190813.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jetty-webapp-9.4.20.v20190813.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jetty-xml-9.4.20.v20190813.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jsch-0.1.55.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\json-smart-2.3.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jsp-api-2.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jsr305-3.0.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jsr311-api-1.1.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\jul-to-slf4j-1.7.25.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\kerb-admin-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\kerb-client-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\kerb-common-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\kerb-core-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\kerb-crypto-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\kerb-identity-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\kerb-server-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\kerb-simplekdc-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\kerb-util-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\kerby-asn1-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\kerby-config-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\kerby-pkix-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\kerby-util-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\kerby-xdr-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\log4j-1.2.17.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\metrics-core-3.2.4.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\netty-3.10.6.Final.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\nimbus-jose-jwt-7.9.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\paranamer-2.3.jar;D:\pyspark\Ha
doop\hadoop-3.2.2\share\hadoop\common\lib\protobuf-java-2.5.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\re2j-1.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\slf4j-api-1.7.25.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\slf4j-log4j12-1.7.25.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\snappy-java-1.0.5.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\stax2-api-3.1.4.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\token-provider-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\woodstox-core-5.0.3.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\lib\zookeeper-3.4.13.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\hadoop-common-3.2.2-tests.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\hadoop-common-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\hadoop-kms-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\common\hadoop-nfs-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\accessors-smart-1.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\animal-sniffer-annotations-1.17.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\asm-5.0.4.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\audience-annotations-0.5.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\avro-1.7.7.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\checker-qual-2.5.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\commons-beanutils-1.9.4.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\commons-cli-1.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\commons-codec-1.11.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\commons-collections-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\commons-compress-1.19.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\commons-configuration2-2.1.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\commons-daemon-1.0.13.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\commons-io-2.5.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\commons-lang3-3.7.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\commons-logging-1.1.3.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\commons-math3-3.1.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\commons-net-3.6.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\commons-text-1.4.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\curator-client-2.13.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\curator-framework-2.13.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\curator-recipes-2.13.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\dnsjava-2.1.7.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\error_prone_annotations-2.2.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\failureaccess-1.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\gson-2.2.4.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\guava-27.0-jre.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\hadoop-annotations-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\hadoop-auth-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\htrace-core4-4.1.0-incubating.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\httpclient-4.5.13.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\httpcore-4.4.13.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\j2
objc-annotations-1.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jackson-annotations-2.9.10.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jackson-core-2.9.10.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jackson-core-asl-1.9.13.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jackson-databind-2.9.10.4.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jackson-jaxrs-1.9.13.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jackson-mapper-asl-1.9.13.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jackson-xc-1.9.13.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\javax.activation-api-1.2.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\javax.servlet-api-3.1.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jaxb-api-2.2.11.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jaxb-impl-2.2.3-1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jcip-annotations-1.0-1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jersey-core-1.19.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jersey-json-1.19.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jersey-server-1.19.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jersey-servlet-1.19.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jettison-1.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jetty-http-9.4.20.v20190813.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jetty-io-9.4.20.v20190813.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jetty-security-9.4.20.v20190813.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jetty-server-9.4.20.v20190813.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jetty-servlet-9.4.20.v20190813.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jetty-util-9.4.20.v20190813.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jetty-util-ajax-9.4.20.v20190813.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jetty-webapp-9.4.20.v20190813.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jetty-xml-9.4.20.v20190813.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jsch-0.1.55.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\json-simple-1.1.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\json-smart-2.3.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jsr305-3.0.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\jsr311-api-1.1.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\kerb-admin-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\kerb-client-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\kerb-common-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\kerb-core-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\kerb-crypto-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\kerb-identity-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\kerb-server-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\kerb-simplekdc-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\kerb-util-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\kerby-asn1-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\kerby-config-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\kerby-pkix-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\kerby-util-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\kerby-xdr-1.0.1.jar;D:\pyspark\Hadoop\had
oop-3.2.2\share\hadoop\hdfs\lib\leveldbjni-all-1.8.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\log4j-1.2.17.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\netty-3.10.6.Final.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\netty-all-4.1.48.Final.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\nimbus-jose-jwt-7.9.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\okhttp-2.7.5.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\okio-1.6.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\paranamer-2.3.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\protobuf-java-2.5.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\re2j-1.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\snappy-java-1.0.5.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\stax2-api-3.1.4.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\token-provider-1.0.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\woodstox-core-5.0.3.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\lib\zookeeper-3.4.13.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\hadoop-hdfs-3.2.2-tests.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\hadoop-hdfs-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\hadoop-hdfs-client-3.2.2-tests.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\hadoop-hdfs-client-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\hadoop-hdfs-httpfs-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\hadoop-hdfs-native-client-3.2.2-tests.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\hadoop-hdfs-native-client-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\hadoop-hdfs-nfs-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\hadoop-hdfs-rbf-3.2.2-tests.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\hdfs\hadoop-hdfs-rbf-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\aopalliance-1.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\bcpkix-jdk15on-1.60.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\bcprov-jdk15on-1.60.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\ehcache-3.3.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\fst-2.50.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\geronimo-jcache_1.0_spec-1.0-alpha-1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\guice-4.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\guice-servlet-4.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\HikariCP-java7-2.4.12.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\jackson-jaxrs-base-2.9.10.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\jackson-jaxrs-json-provider-2.9.10.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\jackson-module-jaxb-annotations-2.9.10.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\java-util-1.9.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\javax.inject-1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\jersey-client-1.19.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\jersey-guice-1.19.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\json-io-2.5.1.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\metrics-core-3.2.4.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\mssql-jdbc-6.2.1.jre7.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share
\hadoop\yarn\lib\objenesis-1.0.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\snakeyaml-1.16.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\lib\swagger-annotations-1.5.4.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\hadoop-yarn-api-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\hadoop-yarn-applications-distributedshell-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\hadoop-yarn-applications-unmanaged-am-launcher-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\hadoop-yarn-client-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\hadoop-yarn-common-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\hadoop-yarn-registry-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\hadoop-yarn-server-applicationhistoryservice-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\hadoop-yarn-server-common-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\hadoop-yarn-server-nodemanager-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\hadoop-yarn-server-resourcemanager-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\hadoop-yarn-server-router-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\hadoop-yarn-server-sharedcachemanager-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\hadoop-yarn-server-tests-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\hadoop-yarn-server-timeline-pluginstorage-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\hadoop-yarn-server-web-proxy-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\hadoop-yarn-services-api-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\hadoop-yarn-services-core-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\yarn\hadoop-yarn-submarine-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\mapreduce\lib\hamcrest-core-1.3.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\mapreduce\lib\junit-4.11.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\mapreduce\hadoop-mapreduce-client-app-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\mapreduce\hadoop-mapreduce-client-common-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\mapreduce\hadoop-mapreduce-client-core-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\mapreduce\hadoop-mapreduce-client-hs-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\mapreduce\hadoop-mapreduce-client-hs-plugins-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\mapreduce\hadoop-mapreduce-client-jobclient-3.2.2-tests.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\mapreduce\hadoop-mapreduce-client-jobclient-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\mapreduce\hadoop-mapreduce-client-nativetask-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\mapreduce\hadoop-mapreduce-client-shuffle-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\mapreduce\hadoop-mapreduce-client-uploader-3.2.2.jar;D:\pyspark\Hadoop\hadoop-3.2.2\share\hadoop\mapreduce\hadoop-mapreduce-examples-3.2.2.jar STARTUP_MSG: build = Unknown -r 7a3bc90b05f257c8ace2f76d74264906f0f7a932; compiled by 'hexiaoqiao' on 2021-01-03T09:26Z STARTUP_MSG: java = 1.8.0_281 ************************************************************/ 2025-06-18 16:50:45,335 INFO checker.ThrottledAsyncChecker: Scheduling a check for [DISK]file:/D:/hadoop-3.2.2/data/datanode 2025-06-18 16:50:45,420 INFO impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties 2025-06-18 16:50:45,483 INFO impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s). 
2025-06-18 16:50:45,484 INFO impl.MetricsSystemImpl: DataNode metrics system started 2025-06-18 16:50:46,677 INFO common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling 2025-06-18 16:50:46,689 INFO datanode.BlockScanner: Initialized block scanner with targetBytesPerSec 1048576 2025-06-18 16:50:46,692 INFO datanode.DataNode: Configured hostname is LAPTOP-FK5QKFGQ 2025-06-18 16:50:46,693 INFO common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling 2025-06-18 16:50:46,695 INFO datanode.DataNode: Starting DataNode with maxLockedMemory = 0 2025-06-18 16:50:46,709 INFO datanode.DataNode: Opened streaming server at /0.0.0.0:9866 2025-06-18 16:50:46,710 INFO datanode.DataNode: Balancing bandwidth is 10485760 bytes/s 2025-06-18 16:50:46,710 INFO datanode.DataNode: Number threads for balancing is 50 2025-06-18 16:50:46,741 INFO util.log: Logging initialized @7589ms to org.eclipse.jetty.util.log.Slf4jLog 2025-06-18 16:50:51,787 INFO server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. 2025-06-18 16:50:51,821 INFO http.HttpRequestLog: Http request log for http.requests.datanode is not defined 2025-06-18 16:50:51,828 INFO http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter) 2025-06-18 16:50:51,829 INFO http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context datanode 2025-06-18 16:50:51,829 INFO http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs 2025-06-18 16:50:51,830 INFO http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static 2025-06-18 16:50:51,848 INFO http.HttpServer2: Jetty bound to port 38751 2025-06-18 16:50:51,849 INFO server.Server: jetty-9.4.20.v20190813; built: 2019-08-13T21:28:18.144Z; git: 84700530e645e812b336747464d6fbbf370c9a20; jvm 1.8.0_281-b09 2025-06-18 16:50:51,865 INFO server.session: DefaultSessionIdManager workerName=node0 2025-06-18 16:50:51,865 INFO server.session: No SessionScavenger set, using defaults 2025-06-18 16:50:51,867 INFO server.session: node0 Scavenging every 660000ms 2025-06-18 16:50:51,874 INFO handler.ContextHandler: Started o.e.j.s.ServletContextHandler@2421cc4{logs,/logs,file:///D:/pyspark/Hadoop/hadoop-3.2.2/logs/,AVAILABLE} 2025-06-18 16:50:51,874 INFO handler.ContextHandler: Started o.e.j.s.ServletContextHandler@21ba0741{static,/static,file:///D:/pyspark/Hadoop/hadoop-3.2.2/share/hadoop/hdfs/webapps/static/,AVAILABLE} 2025-06-18 16:50:51,926 INFO util.TypeUtil: JVM Runtime does not support Modules 2025-06-18 16:50:51,932 INFO handler.ContextHandler: Started o.e.j.w.WebAppContext@43f82e78{datanode,/,file:///D:/pyspark/Hadoop/hadoop-3.2.2/share/hadoop/hdfs/webapps/datanode/,AVAILABLE}{file:/D:/pyspark/Hadoop/hadoop-3.2.2/share/hadoop/hdfs/webapps/datanode} 2025-06-18 16:50:51,939 INFO server.AbstractConnector: Started ServerConnector@1e097d59{HTTP/1.1,[http/1.1]}{localhost:38751} 2025-06-18 16:50:51,940 INFO server.Server: Started @12789ms 2025-06-18 16:50:52,540 INFO web.DatanodeHttpServer: Listening HTTP traffic on /0.0.0.0:9864 2025-06-18 16:50:52,545 INFO util.JvmPauseMonitor: Starting JVM pause monitor 2025-06-18 16:50:52,545 INFO datanode.DataNode: dnUserName = aaa 2025-06-18 16:50:52,546 INFO 
datanode.DataNode: supergroup = supergroup 2025-06-18 16:50:52,573 INFO ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue, queueCapacity: 1000, scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler, ipcBackoff: false. 2025-06-18 16:50:52,583 INFO ipc.Server: Starting Socket Reader #1 for port 9867 2025-06-18 16:50:52,720 INFO datanode.DataNode: Opened IPC server at /0.0.0.0:9867 2025-06-18 16:50:52,729 INFO datanode.DataNode: Refresh request received for nameservices: null 2025-06-18 16:50:52,735 INFO datanode.DataNode: Starting BPOfferServices for nameservices: <default> 2025-06-18 16:50:52,740 INFO datanode.DataNode: Block pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:9000 starting to offer service 2025-06-18 16:50:52,745 INFO ipc.Server: IPC Server Responder: starting 2025-06-18 16:50:52,745 INFO ipc.Server: IPC Server listener on 9867: starting 2025-06-18 16:50:52,954 INFO datanode.DataNode: Acknowledging ACTIVE Namenode during handshakeBlock pool <registering> (Datanode Uuid unassigned) service to localhost/127.0.0.1:9000 2025-06-18 16:50:52,956 INFO common.Storage: Using 1 threads to upgrade data directories (dfs.datanode.parallel.volumes.load.threads.num=1, dataDirs=1) 2025-06-18 16:50:52,965 INFO common.Storage: Lock on D:\hadoop-3.2.2\data\datanode\in_use.lock acquired by nodename 16808@LAPTOP-FK5QKFGQ 2025-06-18 16:50:52,970 WARN common.Storage: Failed to add storage directory [DISK]file:/D:/hadoop-3.2.2/data/datanode java.io.IOException: Incompatible clusterIDs in D:\hadoop-3.2.2\data\datanode: namenode clusterID = CID-0243def2-304c-4ffd-871c-57b2cdf0182f; datanode clusterID = CID-a6ff55fc-9daf-4605-8a53-edaae5a9f8de at org.apache.hadoop.hdfs.server.datanode.DataStorage.doTransition(DataStorage.java:744) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadStorageDirectory(DataStorage.java:294) at org.apache.hadoop.hdfs.server.datanode.DataStorage.loadDataStorage(DataStorage.java:407) at org.apache.hadoop.hdfs.server.datanode.DataStorage.addStorageLocations(DataStorage.java:387) at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:559) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1748) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1684) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:392) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:282) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:829) at java.lang.Thread.run(Thread.java:748) 2025-06-18 16:50:52,973 ERROR datanode.DataNode: Initialization failed for Block pool <registering> (Datanode Uuid cd899db0-fd95-4996-8250-261d1d36dbda) service to localhost/127.0.0.1:9000. Exiting. java.io.IOException: All specified directories have failed to load. 
at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:560) at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1748) at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1684) at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:392) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:282) at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:829) at java.lang.Thread.run(Thread.java:748) 2025-06-18 16:50:52,973 WARN datanode.DataNode: Ending block pool service for: Block pool <registering> (Datanode Uuid cd899db0-fd95-4996-8250-261d1d36dbda) service to localhost/127.0.0.1:9000 2025-06-18 16:50:52,974 INFO datanode.DataNode: Removed Block pool <registering> (Datanode Uuid cd899db0-fd95-4996-8250-261d1d36dbda) 2025-06-18 16:50:54,974 WARN datanode.DataNode: Exiting Datanode 2025-06-18 16:50:54,976 INFO datanode.DataNode: SHUTDOWN_MSG: /************************************************************ SHUTDOWN_MSG: Shutting down DataNode at LAPTOP-FK5QKFGQ/192.168.10.1 ************************************************************/
The following error is reported after running:

D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\bin\java.exe "-javaagent:D:\2025.9\JAVA\IDEA 2023.1\IntelliJ IDEA 2023.1\lib\idea_rt.jar=59681:D:\2025.9\JAVA\IDEA 2023.1\IntelliJ IDEA 2023.1\bin" -Dfile.encoding=UTF-8 -classpath D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\charsets.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\deploy.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\ext\access-bridge-64.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\ext\cldrdata.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\ext\dnsns.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\ext\jaccess.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\ext\jfxrt.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\ext\localedata.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\ext\nashorn.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\ext\sunec.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\ext\sunjce_provider.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\ext\sunmscapi.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\ext\sunpkcs11.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\ext\zipfs.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\javaws.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\jce.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\jfr.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\jfxswt.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\jsse.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\management-agent.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\plugin.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\resources.jar;D:\2025.9\Hadoop\test\4\jdk1.8\jdk1.8\jdk1.8.0_181\jre\lib\rt.jar;D:\2025.9\Hadoop\java\hadoop_project\out\production\hadoop_project;D:\2025.9\Hadoop\test\4\lib2\fst-2.50.jar;D:\2025.9\Hadoop\test\4\lib2\re2j-1.1.jar;D:\2025.9\Hadoop\test\4\lib2\asm-5.0.4.jar;D:\2025.9\Hadoop\test\4\lib2\guice-4.0.jar;D:\2025.9\Hadoop\test\4\lib2\jna-5.2.0.jar;D:\2025.9\Hadoop\test\4\lib2\avro-1.7.7.jar;D:\2025.9\Hadoop\test\4\lib2\gson-2.8.9.jar;D:\2025.9\Hadoop\test\4\lib2\okio-2.8.0.jar;D:\2025.9\Hadoop\test\4\lib2\jline-3.9.0.jar;D:\2025.9\Hadoop\test\4\lib2\jsch-0.1.55.jar;D:\2025.9\Hadoop\test\4\lib2\asm-tree-9.1.jar;D:\2025.9\Hadoop\test\4\lib2\jettison-1.1.jar;D:\2025.9\Hadoop\test\4\lib2\jsr305-3.0.2.jar;D:\2025.9\Hadoop\test\4\lib2\log4j-1.2.17.jar;D:\2025.9\Hadoop\test\4\lib2\okhttp-4.9.3.jar;D:\2025.9\Hadoop\test\4\lib2\dnsjava-2.1.7.jar;D:\2025.9\Hadoop\test\4\lib2\ehcache-3.3.1.jar;D:\2025.9\Hadoop\test\4\lib2\json-io-2.5.1.jar;D:\2025.9\Hadoop\test\4\lib2\objenesis-2.6.jar;D:\2025.9\Hadoop\test\4\lib2\paranamer-2.3.jar;D:\2025.9\Hadoop\test\4\lib2\guava-27.0-jre.jar;D:\2025.9\Hadoop\test\4\lib2\javax.inject-1.jar;D:\2025.9\Hadoop\test\4\lib2\snakeyaml-1.26.jar;D:\2025.9\Hadoop\test\4\lib2\aopalliance-1.0.jar;D:\2025.9\Hadoop\test\4\lib2\asm-commons-9.1.jar;D:\2025.9\Hadoop\test\4\lib2\commons-cli-1.2.jar;D:\2025.9\Hadoop\test\4\lib2\commons-net-3.6.jar;D:\2025.9\Hadoop\test\4\lib2\httpcore-4.4.13.jar;D:\2025.9\Hadoop\test\4\lib2\java-util-1.9.0.jar;D:\2025.9\Hadoop\test\4\lib2\jaxb-api-2.2.11.jar;D:\2025.9\Hadoop\test\4\lib2\kerb-core-1.0.1.jar;D:\2025.9\Hadoop\test\4\lib2\kerb-util-1.0.1.jar;D:\2025.9\Hadoop\test\4\lib2\kerby-xdr-1.0.1.jar;D:\2025.9\Hadoop\test\4\li
b2\reload4j-1.2.22.jar;D:\2025.9\Hadoop\test\4\lib2\stax2-api-4.2.1.jar;D:\2025.9\Hadoop\test\4\lib2\zookeeper-3.5.6.jar;D:\2025.9\Hadoop\test\4\lib2\asm-analysis-9.1.jar;D:\2025.9\Hadoop\test\4\lib2\commons-io-2.8.0.jar;D:\2025.9\Hadoop\test\4\lib2\commons-text-1.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-kms-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-nfs-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\jersey-core-1.19.jar;D:\2025.9\Hadoop\test\4\lib2\jersey-json-1.19.jar;D:\2025.9\Hadoop\test\4\lib2\json-smart-2.4.7.jar;D:\2025.9\Hadoop\test\4\lib2\jsr311-api-1.1.1.jar;D:\2025.9\Hadoop\test\4\lib2\kerb-admin-1.0.1.jar;D:\2025.9\Hadoop\test\4\lib2\kerby-asn1-1.0.1.jar;D:\2025.9\Hadoop\test\4\lib2\kerby-pkix-1.0.1.jar;D:\2025.9\Hadoop\test\4\lib2\kerby-util-1.0.1.jar;D:\2025.9\Hadoop\test\4\lib2\slf4j-api-1.7.36.jar;D:\2025.9\Hadoop\test\4\lib2\failureaccess-1.0.jar;D:\2025.9\Hadoop\test\4\lib2\guice-servlet-4.0.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-auth-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-hdfs-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\httpclient-4.5.13.jar;D:\2025.9\Hadoop\test\4\lib2\jackson-xc-1.9.13.jar;D:\2025.9\Hadoop\test\4\lib2\jaxb-impl-2.2.3-1.jar;D:\2025.9\Hadoop\test\4\lib2\jersey-guice-1.19.jar;D:\2025.9\Hadoop\test\4\lib2\json-simple-1.1.1.jar;D:\2025.9\Hadoop\test\4\lib2\kerb-client-1.0.1.jar;D:\2025.9\Hadoop\test\4\lib2\kerb-common-1.0.1.jar;D:\2025.9\Hadoop\test\4\lib2\kerb-crypto-1.0.1.jar;D:\2025.9\Hadoop\test\4\lib2\kerb-server-1.0.1.jar;D:\2025.9\Hadoop\test\4\lib2\checker-qual-2.5.2.jar;D:\2025.9\Hadoop\test\4\lib2\commons-codec-1.15.jar;D:\2025.9\Hadoop\test\4\lib2\jersey-client-1.19.jar;D:\2025.9\Hadoop\test\4\lib2\jersey-server-1.19.jar;D:\2025.9\Hadoop\test\4\lib2\kerby-config-1.0.1.jar;D:\2025.9\Hadoop\test\4\lib2\leveldbjni-all-1.8.jar;D:\2025.9\Hadoop\test\4\lib2\metrics-core-3.2.4.jar;D:\2025.9\Hadoop\test\4\lib2\netty-3.10.6.Final.jar;D:\2025.9\Hadoop\test\4\lib2\bcpkix-jdk15on-1.60.jar;D:\2025.9\Hadoop\test\4\lib2\bcprov-jdk15on-1.60.jar;D:\2025.9\Hadoop\test\4\lib2\commons-math3-3.1.1.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-common-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\jackson-core-2.12.7.jar;D:\2025.9\Hadoop\test\4\lib2\jersey-servlet-1.19.jar;D:\2025.9\Hadoop\test\4\lib2\kerb-identity-1.0.1.jar;D:\2025.9\Hadoop\test\4\lib2\protobuf-java-2.5.0.jar;D:\2025.9\Hadoop\test\4\lib2\snappy-java-1.1.8.2.jar;D:\2025.9\Hadoop\test\4\lib2\woodstox-core-5.3.0.jar;D:\2025.9\Hadoop\test\4\lib2\commons-lang3-3.12.0.jar;D:\2025.9\Hadoop\test\4\lib2\curator-client-4.2.0.jar;D:\2025.9\Hadoop\test\4\lib2\jackson-jaxrs-1.9.13.jar;D:\2025.9\Hadoop\test\4\lib2\kerb-simplekdc-1.0.1.jar;D:\2025.9\Hadoop\test\4\lib2\kotlin-stdlib-1.4.10.jar;D:\2025.9\Hadoop\test\4\lib2\token-provider-1.0.1.jar;D:\2025.9\Hadoop\test\4\lib2\zookeeper-jute-3.5.6.jar;D:\2025.9\Hadoop\test\4\lib2\accessors-smart-2.4.7.jar;D:\2025.9\Hadoop\test\4\lib2\commons-compress-1.21.jar;D:\2025.9\Hadoop\test\4\lib2\commons-daemon-1.0.13.jar;D:\2025.9\Hadoop\test\4\lib2\commons-logging-1.1.3.jar;D:\2025.9\Hadoop\test\4\lib2\curator-recipes-4.2.0.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-hdfs-nfs-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-hdfs-rbf-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-registry-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-yarn-api-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\HikariCP-java7-2.4.12.jar;D:\2025.9\Hadoop\test\4\lib2\mssql-jdbc-6.2.1.jre7.jar;D:\2025.9\Hadoop\test\4\lib2\nimbus-jose-jwt-9.8.1.jar;D:\2025.9\Hadoop\test\4\lib2\slf4j-reload4j-1.7.36.jar;D:\2025.9\Hadoop\test\4\lib2\j2o
bjc-annotations-1.1.jar;D:\2025.9\Hadoop\test\4\lib2\jcip-annotations-1.0-1.jar;D:\2025.9\Hadoop\test\4\lib2\netty-all-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\commons-beanutils-1.9.4.jar;D:\2025.9\Hadoop\test\4\lib2\curator-framework-4.2.0.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-hdfs-3.3.4-tests.jar;D:\2025.9\Hadoop\test\4\lib2\jackson-core-asl-1.9.13.jar;D:\2025.9\Hadoop\test\4\lib2\jackson-databind-2.12.7.jar;D:\2025.9\Hadoop\test\4\lib2\javax.servlet-api-3.1.0.jar;D:\2025.9\Hadoop\test\4\lib2\javax.websocket-api-1.0.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-annotations-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-hdfs-client-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-hdfs-httpfs-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-yarn-client-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-yarn-common-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\netty-codec-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\commons-collections-3.2.2.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-common-3.3.4-tests.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-shaded-guava-1.1.1.jar;D:\2025.9\Hadoop\test\4\lib2\jackson-jaxrs-base-2.12.7.jar;D:\2025.9\Hadoop\test\4\lib2\jackson-mapper-asl-1.9.13.jar;D:\2025.9\Hadoop\test\4\lib2\jetty-io-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\netty-buffer-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\netty-common-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\swagger-annotations-1.5.4.jar;D:\2025.9\Hadoop\test\4\lib2\audience-annotations-0.5.0.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-yarn-registry-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\jackson-annotations-2.12.7.jar;D:\2025.9\Hadoop\test\4\lib2\jakarta.xml.bind-api-2.3.2.jar;D:\2025.9\Hadoop\test\4\lib2\jetty-xml-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\netty-handler-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-hdfs-rbf-3.3.4-tests.jar;D:\2025.9\Hadoop\test\4\lib2\jetty-http-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\jetty-jndi-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\jetty-plus-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\jetty-util-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\kotlin-stdlib-common-1.4.10.jar;D:\2025.9\Hadoop\test\4\lib2\netty-resolver-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\commons-configuration2-2.1.1.jar;D:\2025.9\Hadoop\test\4\lib2\jakarta.activation-api-1.2.1.jar;D:\2025.9\Hadoop\test\4\lib2\netty-codec-dns-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\netty-codec-xml-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\netty-transport-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\jetty-client-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\jetty-server-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\jetty-webapp-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\netty-codec-http-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\netty-codec-mqtt-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\netty-codec-smtp-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-hdfs-client-3.3.4-tests.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-yarn-server-tests-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-yarn-services-api-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\javax.websocket-client-api-1.0.jar;D:\2025.9\Hadoop\test\4\lib2\jetty-servlet-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\netty-codec-http2-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\netty-codec-redis-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\netty-codec-socks-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\netty-codec-stomp-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\websocket-api-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\li
b2\animal-sniffer-annotations-1.17.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-hdfs-native-client-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-mapreduce-examples-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-yarn-server-common-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-yarn-server-router-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-yarn-services-core-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\jetty-security-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\netty-resolver-dns-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-mapreduce-client-hs-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-shaded-protobuf_3_7-1.1.1.jar;D:\2025.9\Hadoop\test\4\lib2\jetty-util-ajax-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\netty-codec-haproxy-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\netty-handler-proxy-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\netty-transport-udt-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-mapreduce-client-app-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\netty-codec-memcache-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\netty-transport-rxtx-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\netty-transport-sctp-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\websocket-client-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\websocket-common-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\websocket-server-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-mapreduce-client-core-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-yarn-server-web-proxy-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\jackson-jaxrs-json-provider-2.12.7.jar;D:\2025.9\Hadoop\test\4\lib2\jetty-annotations-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\websocket-servlet-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\geronimo-jcache_1.0_spec-1.0-alpha-1.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-mapreduce-client-common-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-yarn-server-nodemanager-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-hdfs-native-client-3.3.4-tests.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-mapreduce-client-shuffle-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-mapreduce-client-uploader-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\jackson-module-jaxb-annotations-2.12.7.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-mapreduce-client-jobclient-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-mapreduce-client-hs-plugins-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-mapreduce-client-nativetask-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-yarn-applications-mawo-core-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-yarn-server-resourcemanager-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\netty-transport-classes-epoll-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-yarn-server-sharedcachemanager-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\netty-transport-classes-kqueue-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\javax-websocket-client-impl-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\javax-websocket-server-impl-9.4.43.v20210629.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-mapreduce-client-jobclient-3.3.4-tests.jar;D:\2025.9\Hadoop\test\4\lib2\netty-resolver-dns-classes-macos-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-yarn-applications-distributedshell-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-yarn-server-timeline-pluginstorage-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\netty-transport-native-unix-common-4.1.77.Final.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-yarn-server-applicationhistoryservice-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\hadoop-yarn-applications-unmanaged-am-launcher-3.3.4.jar;D:\2025.9\Hadoop\test\4\lib2\netty
-transport-native-kqueue-4.1.77.Final-osx-x86_64.jar;D:\2025.9\Hadoop\test\4\lib2\netty-transport-native-epoll-4.1.77.Final-linux-x86_64.jar;D:\2025.9\Hadoop\test\4\lib2\netty-resolver-dns-native-macos-4.1.77.Final-osx-x86_64.jar;D:\2025.9\Hadoop\test\4\lib2\netty-transport-native-kqueue-4.1.77.Final-osx-aarch_64.jar;D:\2025.9\Hadoop\test\4\lib2\netty-transport-native-epoll-4.1.77.Final-linux-aarch_64.jar;D:\2025.9\Hadoop\test\4\lib2\netty-resolver-dns-native-macos-4.1.77.Final-osx-aarch_64.jar;D:\2025.9\Hadoop\test\4\lib2\listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar WordCount 2025-10-28 19:51:14,271 WARN [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(60)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 2025-10-28 19:51:19,981 WARN [main] impl.MetricsConfig (MetricsConfig.java:loadFirst(136)) - Cannot locate configuration: tried hadoop-metrics2-jobtracker.properties,hadoop-metrics2.properties 2025-10-28 19:51:20,043 INFO [main] impl.MetricsSystemImpl (MetricsSystemImpl.java:startTimer(378)) - Scheduled Metric snapshot period at 10 second(s). 2025-10-28 19:51:20,044 INFO [main] impl.MetricsSystemImpl (MetricsSystemImpl.java:start(191)) - JobTracker metrics system started Exception in thread "main" org.apache.hadoop.ipc.RpcException: RPC response exceeds maximum data length at org.apache.hadoop.ipc.Client$IpcStreams.readResponse(Client.java:1936) at org.apache.hadoop.ipc.Client$Connection.receiveRpcResponse(Client.java:1238) at org.apache.hadoop.ipc.Client$Connection.run(Client.java:1134)
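The "RPC response exceeds maximum data length" failure at the end of this log usually means the client read something that is not a well-formed RPC response: the most common causes are an fs.defaultFS (or job address) that points at a NameNode HTTP port instead of the RPC port, or client-side Hadoop jars that do not match the cluster version. The snippet below is a minimal, hypothetical connectivity check, assuming the NameNode RPC address hdfs://localhost:9000 seen in the DataNode log earlier.

```kotlin
// A quick HDFS connectivity check against the NameNode RPC endpoint.
// Assumes fs.defaultFS should be hdfs://localhost:9000 (the RPC port from the
// log above); pointing this at an HTTP port such as the web UI is a classic
// way to hit "RPC response exceeds maximum data length".
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.FileSystem
import org.apache.hadoop.fs.Path

fun main() {
    val conf = Configuration()
    conf.set("fs.defaultFS", "hdfs://localhost:9000")  // RPC port, not the HTTP/web UI port
    FileSystem.get(conf).use { fs ->
        // Listing the root directory is enough to exercise one RPC round trip.
        fs.listStatus(Path("/")).forEach { println(it.path) }
    }
}
```

Note also that the run above puts Hadoop 3.3.4 client jars on the classpath while the DataNode log earlier comes from a 3.2.2 install; if both belong to the same environment, that version skew is exactly the kind of mismatch the dependency clean-up described at the top of this post is meant to avoid.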