Hadoop HDFS starts, then exits on its own

This post records, in detail, a problem where Hadoop HDFS exits automatically right after starting. Log analysis traced the cause to an inconsistency between hostnames and domain names in the configuration files, combined with DNS resolution or /etc/hosts issues. The author ultimately fixed it by replacing the hostnames in the configuration with resolvable domain names and making sure name resolution was correct.

This HDFS start-then-exit problem cost me a whole day!

The log looks like this:
```
2010-05-19 12:47:44,991 INFO  http.HttpServer - Version Jetty/5.1.4
2010-05-19 12:47:44,999 INFO  util.Credential - Checking Resource aliases
2010-05-19 12:47:45,405 INFO  util.Container - Started org.mortbay.jetty.servlet.WebApplicationHandler@49d67c
2010-05-19 12:47:45,440 INFO  util.Container - Started WebApplicationContext[/static,/static]
2010-05-19 12:47:45,509 INFO  util.Container - Started org.mortbay.jetty.servlet.WebApplicationHandler@8238f4
2010-05-19 12:47:45,510 INFO  util.Container - Started WebApplicationContext[/logs,/logs]
2010-05-19 12:47:45,593 INFO  util.Container - Started org.mortbay.jetty.servlet.WebApplicationHandler@110c31
2010-05-19 12:47:45,596 INFO  util.Container - Started WebApplicationContext[/,/]
2010-05-19 12:47:45,598 INFO  http.SocketListener - Started SocketListener on 0.0.0.0:50070
2010-05-19 12:47:45,599 INFO  util.Container - Started org.mortbay.jetty.Server@e91f5d
2010-05-19 12:47:45,615 INFO  util.ThreadedServer - Stopping Acceptor ServerSocket[addr=0.0.0.0/0.0.0.0,port=0,localport=50070]
2010-05-19 12:47:45,616 INFO  http.SocketListener - Stopped SocketListener on 0.0.0.0:50070
2010-05-19 12:47:45,616 INFO  util.Container - Stopped org.mortbay.jetty.servlet.WebApplicationHandler@49d67c
2010-05-19 12:47:45,675 INFO  util.Container - Stopped WebApplicationContext[/static,/static]
2010-05-19 12:47:45,676 INFO  util.Container - Stopped org.mortbay.jetty.servlet.WebApplicationHandler@8238f4
2010-05-19 12:47:45,729 INFO  util.Container - Stopped WebApplicationContext[/logs,/logs]
2010-05-19 12:47:45,729 INFO  util.Container - Stopped org.mortbay.jetty.servlet.WebApplicationHandler@110c31
2010-05-19 12:47:45,778 INFO  util.Container - Stopped WebApplicationContext[/,/]
2010-05-19 12:47:45,778 INFO  util.Container - Stopped org.mortbay.jetty.Server@e91f5d
2010-05-19 12:47:45,779 WARN  namenode.FSNamesystem - ReplicationMonitor thread received InterruptedException.
java.lang.InterruptedException: sleep interrupted
```

A baffling problem. Some say it is caused by having a dual IP stack, and that IPv6 should be disabled on Linux; but I figured the development team must have accounted for that already, so the problem had to lie elsewhere.
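For completeness, the workaround usually attached to that dual-stack theory is to make Hadoop's JVMs prefer IPv4. This is a hedged config-fragment sketch for `hadoop-env.sh`, not something verified against this cluster:

```shell
# Commonly suggested dual-stack workaround (not verified for this setup):
# add to conf/hadoop-env.sh so Hadoop's JVMs prefer IPv4 sockets.
export HADOOP_OPTS="-Djava.net.preferIPv4Stack=true"
```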

After more than a day of fiddling, I noticed the following pattern:

1. On the NameNode's first start, the log looks normal, but `bin/hadoop fs -put` fails with the familiar:

```
DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: java.io.IOException:
could only be replicated to 0 nodes, instead of 1
```

That error can also be caused by a firewall; I had run into that before.

2. On the NameNode's second (and later) starts, the log is abnormal: exactly what is pasted above, a normal-looking startup followed by an automatic shutdown.

3. After repeated testing, I traced the problem to the following:

3.1. Every machine has a hostname, and when a job runs it looks up addresses by hostname, so you need either working DNS resolution or entries written into /etc/hosts. (This point is an aside, and I am not certain it is entirely accurate.)
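As a sketch of that hosts setup (the IP addresses below are invented for illustration, not from this cluster):

```shell
# Hypothetical /etc/hosts entries mapping each node's name (IPs invented):
#   192.168.1.100  home0.hadoop
#   192.168.1.101  home1.hadoop
# 'getent hosts' consults /etc/hosts as well as DNS, so it shows exactly
# the resolution the daemons will see; 'localhost' stands in here for
# whichever name your configs actually use.
getent hosts localhost
```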

3.2. The actual subject of this post:
```xml
<property>
        <name>fs.default.name</name>
        <value>hdfs://home0.hadoop:9000</value>
</property>

<property>
        <name>mapred.job.tracker</name>
        <value>home0.hadoop:9001</value>
</property>
```
Here, home0.hadoop must not be a bare hostname. It has to be a domain name, one that actually resolves via DNS or /etc/hosts.
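To make explicit which part of the value must resolve, here is a small sketch that extracts the host from the `fs.default.name` URI using plain shell parameter expansion (nothing Hadoop-specific):

```shell
# The URI from the config above; the host component between "hdfs://" and
# the port is what DNS or /etc/hosts must be able to resolve.
uri="hdfs://home0.hadoop:9000"
host="${uri#hdfs://}"   # strip the scheme -> "home0.hadoop:9000"
host="${host%%:*}"      # strip the port   -> "home0.hadoop"
echo "$host"
```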

3.3. I have also changed the masters file to home0.hadoop, but I have not tested whether that is related to this problem.

---

**Follow-up question (from the comments):**

I'm setting up Hadoop in a virtual machine and have run into problems. Please help me analyze this:

```
[root@master bin]# su ./start-dfs.sh
su: user ./start-dfs.sh does not exist
[root@master bin]# hdfs namenode -format
WARNING: /export/server/hadoop/logs does not exist. Creating.
2025-10-14 03:15:33,652 INFO namenode.NameNode: STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting NameNode
STARTUP_MSG:   host = master/192.168.80.130
STARTUP_MSG:   args = [-format]
STARTUP_MSG:   version = 3.1.3
STARTUP_MSG:   classpath = /export/server/hadoop/etc/hadoop:... (long jar listing omitted)
STARTUP_MSG:   build = https://gitbox.apache.org/repos/asf/hadoop.git -r ba631c436b806728f8ec2f54ab1e289526c90579; compiled by 'ztang' on 2019-09-12T02:47Z
STARTUP_MSG:   java = 1.8.0_241
************************************************************/
2025-10-14 03:15:33,677 INFO namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT]
2025-10-14 03:15:33,821 ERROR conf.Configuration: error parsing conf core-site.xml
com.ctc.wstx.exc.WstxEOFException: Unexpected EOF; was expecting a close tag for element <configuration>
 at [row,col,system-id]: [31,0,"file:/export/server/hadoop/etc/hadoop/core-site.xml"]
	at com.ctc.wstx.sr.StreamScanner.throwUnexpectedEOF(StreamScanner.java:687)
	at com.ctc.wstx.sr.BasicStreamReader.throwUnexpectedEOF(BasicStreamReader.java:5608)
	at com.ctc.wstx.sr.BasicStreamReader.nextFromTree(BasicStreamReader.java:2802)
	at com.ctc.wstx.sr.BasicStreamReader.next(BasicStreamReader.java:1123)
	at org.apache.hadoop.conf.Configuration$Parser.parseNext(Configuration.java:3320)
	at org.apache.hadoop.conf.Configuration$Parser.parse(Configuration.java:3114)
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:3007)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2968)
	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2848)
	at org.apache.hadoop.conf.Configuration.get(Configuration.java:1200)
	at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1812)
	at org.apache.hadoop.conf.Configuration.getTimeDuration(Configuration.java:1789)
	at org.apache.hadoop.util.ShutdownHookManager.getShutdownTimeout(ShutdownHookManager.java:183)
	at org.apache.hadoop.util.ShutdownHookManager$HookEntry.<init>(ShutdownHookManager.java:207)
	at org.apache.hadoop.util.ShutdownHookManager.addShutdownHook(ShutdownHookManager.java:304)
	at org.apache.hadoop.util.StringUtils.startupShutdownMessage(StringUtils.java:723)
	at org.apache.hadoop.util.StringUtils.startupShutdownMessage(StringUtils.java:707)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1754)
2025-10-14 03:15:33,824 ERROR namenode.NameNode: Failed to start namenode.
java.lang.RuntimeException: com.ctc.wstx.exc.WstxEOFException: Unexpected EOF; was expecting a close tag for element <configuration>
 at [row,col,system-id]: [31,0,"file:/export/server/hadoop/etc/hadoop/core-site.xml"]
	... (same stack trace as above)
2025-10-14 03:15:33,899 INFO util.ExitUtil: Exiting with status 1: java.lang.RuntimeException: com.ctc.wstx.exc.WstxEOFException: Unexpected EOF; was expecting a close tag for element <configuration> at [row,col,system-id]: [31,0,"file:/export/server/hadoop/etc/hadoop/core-site.xml"]
2025-10-14 03:15:33,918 ERROR conf.Configuration: error parsing conf core-site.xml
com.ctc.wstx.exc.WstxEOFException: Unexpected EOF; was expecting a close tag for element <configuration>
 at [row,col,system-id]: [31,0,"file:/export/server/hadoop/etc/hadoop/core-site.xml"]
	... (same stack trace, this time from the shutdown hook)
Exception in thread "Thread-1" java.lang.RuntimeException: com.ctc.wstx.exc.WstxEOFException: Unexpected EOF; was expecting a close tag for element <configuration>
 at [row,col,system-id]: [31,0,"file:/export/server/hadoop/etc/hadoop/core-site.xml"]
	... (same stack trace)
[root@master bin]# cd /sbin
[root@master sbin]# start-dfs.sh
-bash: /usr/sbin/start-dfs.sh: Permission denied
[root@master sbin]# sudo start-dfs.sh
sudo: start-dfs.sh: command not found
[root@master sbin]# vi core-site.xml
[root@master sbin]# cd /export/server
[root@master server]# vi core-site.xml
[root@master server]# vi hdfs-site.xml
[root@master server]# cd /export/server/hadoop/etc/hadoop/
[root@master hadoop]# ls
capacity-scheduler.xml  configuration.xsl  container-executor.cfg  core-site.xml
hadoop-env.cmd  hadoop-env.sh  hadoop-metrics2.properties  hadoop-policy.xml
hadoop-user-functions.sh.example  hdfs-site.xml  httpfs-env.sh  httpfs-log4j.properties
httpfs-signature.secret  httpfs-site.xml  kms-acls.xml  kms-env.sh  kms-log4j.properties
kms-site.xml  log4j.properties  mapred-env.cmd  mapred-env.sh  mapred-queues.xml.template
mapred-site.xml  shellprofile.d  ssl-client.xml.example  ssl-server.xml.example
user_ec_policies.xml.template  workers  yarn-env.cmd  yarn-env.sh
yarnservice-log4j.properties  yarn-site.xml
[root@master hadoop]# vi core-site.xml
[root@master hadoop]# cd /sbin
[root@master sbin]# start-dfs.sh
-bash: /usr/sbin/start-dfs.sh: Permission denied
[root@master sbin]# sudo chomd +x /usr/sbin/start-dfs.sh
sudo: chomd: command not found
[root@master sbin]# chomd +x /usr/sbin/start-dfs.sh
-bash: chomd: command not found
[root@master sbin]# chmod +x /usr/sbin/start-dfs.sh
[root@master sbin]# start-dfs.sh
[root@master sbin]# jps
18301 Jps
[root@master sbin]# ../start-dfs.sh
-bash: ../start-dfs.sh: No such file or directory
[root@master sbin]# ./start-dfs.sh
[root@master sbin]# jps
18872 Jps
[root@master sbin]#
```
**Answer:**

This involves several operational mistakes made during Hadoop setup, plus the analysis of the resulting exceptions. Below is a systematic walkthrough and the fixes.

---

### Overview of the problems

While configuring your Hadoop VM you ran several commands and hit these issues:

1. Misusing `su ./start-dfs.sh`: `su` interprets the script path as a (nonexistent) user name;
2. `hdfs namenode -format` failing with an XML parse error (unexpected EOF);
3. Repeated attempts to run `start-dfs.sh` failing with "Permission denied" or "command not found";
4. A typo: `chomd` should be `chmod`;
5. Even after fixing permissions and running the script, `jps` shows no NameNode process, so the cluster never actually started.

The root cause is a **malformed `core-site.xml`**: Hadoop cannot load its configuration, so the NameNode fails to start.

---

### Key errors

#### 1. `error parsing conf core-site.xml: Unexpected EOF`

```log
ERROR conf.Configuration: error parsing conf core-site.xml
com.ctc.wstx.exc.WstxEOFException: Unexpected EOF; was expecting a close tag for element <configuration>
 at [row,col,system-id]: [31,0,"file:/export/server/hadoop/etc/hadoop/core-site.xml"]
```

This error tells you precisely:

- The file is `/export/server/hadoop/etc/hadoop/core-site.xml`.
- Parsing hit an unexpected end-of-file at line 31: the `<configuration>` element is never closed.
- Likely causes: a missing `</configuration>`, or an unclosed `<property>` child (e.g. a missing `</name>` or `</value>`).

> ⚠️ Hadoop configuration files must be valid XML; any syntax error prevents the services from starting.

---

#### 2. The `su ./start-dfs.sh` mistake

```bash
su: user ./start-dfs.sh does not exist
```

You used `su` (switch user) to try to execute a script.
✅ The right move is to run the script directly: `./start-dfs.sh`. No `su` needed.

---

#### 3. `Permission denied` and `sudo: command not found`

```bash
-bash: /usr/sbin/start-dfs.sh: Permission denied
sudo: start-dfs.sh: command not found
```

This indicates:

- The path is wrong: `start-dfs.sh` lives in `$HADOOP_HOME/sbin/`, not `/usr/sbin/`.
- Your PATH probably does not include Hadoop's sbin directory.
- You also typed `chomd` for `chmod`, a common slip.

---

#### 4. `jps` shows no NameNode, DataNode, etc.

The script appeared to "run", but because the configuration file is broken the NameNode never actually started; it exits on its own.

---

### Fix, step by step

#### ✅ Step 1: repair `core-site.xml`

Open the file in the configuration directory:

```bash
cd /export/server/hadoop/etc/hadoop/
vi core-site.xml
```

Make sure it is structured like this (example):

```xml
<?xml version="1.0"?>
<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://master:9000</value>
    </property>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>/export/data/hadoop/tmp</value>
    </property>
</configuration>
```

Checklist:

- Is the `<?xml version="1.0"?>` declaration present?
- Does the file open with `<configuration>` and close with `</configuration>`?
- Does every `<property>` contain a complete, properly closed `<name>` and `<value>`?

💡 Validate the XML with a tool (e.g. an online XML validator).

---

#### ✅ Step 2: confirm the script's location and permissions

Hadoop's management scripts live in `$HADOOP_HOME/sbin/`, not `/usr/sbin/`. So:

```bash
cd /export/server/hadoop/sbin
./start-dfs.sh
```

If you get "Permission denied", add execute permission first:

```bash
chmod +x /export/server/hadoop/sbin/*.sh
```

Do not touch files under `/usr/sbin/` unless you deliberately created symlinks there.

---

#### ✅ Step 3: re-format the NameNode (optional)

Only after you have confirmed `core-site.xml` is fixed:

```bash
hdfs namenode -format
```

⚠️ Be careful with this command in production: it wipes HDFS data.

---

#### ✅ Step 4: start DFS and check the processes

```bash
/export/server/hadoop/sbin/start-dfs.sh
jps
```

You should see these key processes:

- NameNode
- DataNode
- SecondaryNameNode (if configured)

If not, check the logs again:

```bash
cat /export/server/hadoop/logs/hadoop-*-namenode-*.log
```

---

#### ✅ Step 5: set environment variables (recommended)

Add Hadoop to your environment to avoid path confusion. Edit `~/.bashrc`:

```bash
export HADOOP_HOME=/export/server/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
```

Then apply it:

```bash
source ~/.bashrc
```

After that you can invoke `start-dfs.sh`, `hdfs`, and the other commands directly.

---

### Background

- **Hadoop configuration files must be valid XML.** Every `.xml` config file must follow the XML rules strictly (closed, correctly nested tags); otherwise the services cannot parse them and fail to start.
- **`start-dfs.sh` location and permissions.** The script lives in `$HADOOP_HOME/sbin/`, needs the execute bit, and should be invoked by its correct path, not copied into system directories.
- **`jps` lists Java processes.** Use it to confirm the Hadoop daemons are running; a missing NameNode/DataNode usually means a configuration error or an occupied port.

---

### Summary

| Problem | Fix |
|------|----------|
| `core-site.xml` parse error | Repair the XML; close every tag |
| `su ./start-dfs.sh` mistake | Use `./start-dfs.sh` instead |
| Permission denied | `chmod +x *.sh` |
| Command not found | Use `$HADOOP_HOME/sbin/`, not `/usr/sbin/` |
| No processes in `jps` | Fix the configuration, then start again |

---

If you like, paste the contents of your `core-site.xml` and I can check it line by line. Good luck with the setup!
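To see what the checklist above catches, here is a small self-contained sketch: it writes a core-site.xml that is deliberately missing its `</configuration>` close tag (the same class of defect the WstxEOFException reports) and detects the imbalance with a crude tag count. A real XML validator is strictly better; this only illustrates the failure mode. File path and contents are invented for the demo.

```shell
# Write a deliberately broken core-site.xml: <configuration> never closes.
cat > /tmp/bad-core-site.xml <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://master:9000</value>
  </property>
EOF
# Crude well-formedness smoke test: count open vs. close tags.
opens=$(grep -c '<configuration>' /tmp/bad-core-site.xml)
closes=$(grep -c '</configuration>' /tmp/bad-core-site.xml)
if [ "$opens" -ne "$closes" ]; then
  echo "unbalanced: <configuration> never closed"
fi
```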