2023.5.19 Hadoop Hands-On Operations (Four Databases)

This article describes how to configure a static IP for a Linux virtual machine, including editing the network interface configuration file and restarting the networking service. It then walks through designing and operating a student table in MySQL, HBase, Redis, and MongoDB, covering inserting, querying, and updating data. It also notes that the mongo shell has been superseded by mongosh, and offers suggestions for resolving the startup warnings.

Course Project

1. ens33 has no IP address

Check the subnet of the VM's NAT network (VMnet8)

image-20230519085947846

Use ip a to display the IP address of ens33

ip a

image-20230519090114228

Set a static IP

Edit the network interface configuration file with the following command:

sudo vi /etc/network/interfaces

In the opened file, locate the section for the interface to be given a static IP (e.g. ens33).

Change that section to the following:

auto ens33
iface ens33 inet static
    address 192.168.0.2    # static IP address
    netmask 255.255.255.0  # subnet mask
    gateway 192.168.0.1    # gateway IP address
    dns-nameservers 8.8.8.8 8.8.4.4  # DNS server addresses; add more as needed

Note: replace the IP address, subnet mask, gateway, and DNS server addresses with values appropriate for your network environment.

Save and exit the editor. In vi, press Esc, then type :wq and press Enter. (The shortcuts Ctrl+O to save and Ctrl+X to exit apply only if you open the file with nano instead.)

Restart the networking service to apply the new configuration:

sudo service networking restart

Or, if you are using Network Manager, use the following command instead:

sudo service network-manager restart

This makes the new static IP address take effect.
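As a quick sanity check on the stanza format above, it can also be parsed programmatically. The snippet below is an illustrative sketch (not part of the original assignment): it reads an interfaces-style stanza embedded as a string and extracts the address fields.

```python
# Minimal parser for an /etc/network/interfaces-style stanza (illustration only).
STANZA = """\
auto ens33
iface ens33 inet static
    address 192.168.0.2
    netmask 255.255.255.0
    gateway 192.168.0.1
    dns-nameservers 8.8.8.8 8.8.4.4
"""

def parse_stanza(text):
    """Return a dict mapping each recognized option to its value(s)."""
    opts = {}
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        key, values = parts[0], parts[1:]
        if key in ("address", "netmask", "gateway"):
            opts[key] = values[0]
        elif key == "dns-nameservers":
            opts[key] = values
    return opts

print(parse_stanza(STANZA))
```

Running it against the stanza from the file confirms the values you expect to take effect after the restart.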

2. Change the hostname

image-20230519085612771

3. Operating the databases

1. MySQL

3.1 MySQL Database Operations

Student table

Name      English  Math  Computer
zhangsan  69       86    77
lisi      55       100   88

1. Based on the table above, design the student table using MySQL 5.6;

image-20230522140108566

a) After the design, use a SELECT statement to output all the records, and provide a screenshot;

image-20230522140147931

b) Query zhangsan's Computer score, and provide a screenshot;

image-20230522140223796

c) Change lisi's Math score to 95, and provide a screenshot.

image-20230522140248550

2. Using the student table designed above, perform the following MySQL operations

a) Add data: English: 45, Math: 89, Computer: 100

scofield  45       89    100

image-20230522140313811

b) Retrieve scofield's English score

image-20230522140333111

-- Create the Student table
CREATE TABLE Student (
  Name VARCHAR(20),
  English INT,
  Math INT,
  Computer INT
);

-- Insert data
INSERT INTO Student (Name, English, Math, Computer)
VALUES ('zhangsan', 69, 86, 77),
       ('lisi', 55, 100, 88);

-- Query all student records
SELECT * FROM Student;

-- Query zhangsan's Computer score
SELECT Computer FROM Student WHERE Name = 'zhangsan';

-- Change lisi's Math score to 95
UPDATE Student SET Math = 95 WHERE Name = 'lisi';

-- Query all student records
SELECT * FROM Student;

-- Add data: scofield English:45 Math:89 Computer:100
INSERT INTO Student (Name, English, Math, Computer)
VALUES ('scofield', 45, 89, 100);

-- Get scofield's English score
SELECT English FROM Student WHERE Name = 'scofield';
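The same statements can be replayed against an in-memory SQLite database to check the logic without a MySQL server. This is a local-testing sketch only; it assumes SQLite's dialect is close enough to MySQL for these particular statements, which holds here but not in general.

```python
import sqlite3

# Replay the assignment's SQL against an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE Student (Name VARCHAR(20), English INT, Math INT, Computer INT)")
cur.executemany(
    "INSERT INTO Student (Name, English, Math, Computer) VALUES (?, ?, ?, ?)",
    [("zhangsan", 69, 86, 77), ("lisi", 55, 100, 88)],
)

# zhangsan's Computer score
cur.execute("SELECT Computer FROM Student WHERE Name = 'zhangsan'")
print(cur.fetchone()[0])  # 77

# Change lisi's Math score to 95
cur.execute("UPDATE Student SET Math = 95 WHERE Name = 'lisi'")

# Add scofield and read back the English score
cur.execute("INSERT INTO Student VALUES ('scofield', 45, 89, 100)")
cur.execute("SELECT English FROM Student WHERE Name = 'scofield'")
print(cur.fetchone()[0])  # 45
```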

2. HBase

3.2 HBase Database Operations

image-20230522140639873

Student table

name      score
          English  Math  Computer
zhangsan  69       86    77
lisi      55       100   88

1. Based on the table above, design the student table in the HBase shell.

image-20230522140822580

a) After the design, use the scan command to browse the table, and provide a screenshot.

image-20230522140918632

b) Query zhangsan's Computer score, and provide a screenshot.

image-20230522141016561

c) Change lisi's Math score to 95, and provide a screenshot.

image-20230522141035849

2. Using the student table designed above, perform the following HBase operations

a) Add data: English: 45, Math: 89, Computer: 100

image-20230522141121761

scofield  45       89    100

b) Retrieve scofield's English score

image-20230522141154117

# Create the table
create 'Student', 'score'

# Insert data
put 'Student', 'zhangsan', 'score:English', '69'
put 'Student', 'zhangsan', 'score:Math', '86'
put 'Student', 'zhangsan', 'score:Computer', '77'
put 'Student', 'lisi', 'score:English', '55'
put 'Student', 'lisi', 'score:Math', '100'
put 'Student', 'lisi', 'score:Computer', '88'

# Browse the table
scan 'Student'

# Query zhangsan's Computer score
get 'Student', 'zhangsan', 'score:Computer'

# Change lisi's Math score to 95
put 'Student', 'lisi', 'score:Math', '95'

# Browse the table
scan 'Student'

# Add data
put 'Student', 'scofield', 'score:English', '45'
put 'Student', 'scofield', 'score:Math', '89'
put 'Student', 'scofield', 'score:Computer', '100'

# Get scofield's English score
get 'Student', 'scofield', 'score:English'
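The shell commands above all operate on HBase's core data model: a sorted map from (row key, column) to value, where a later put overwrites an earlier one. The toy class below sketches that model in Python (an illustration only; real HBase additionally versions cells by timestamp and distributes rows across regions):

```python
# Toy model of HBase's (row key, column) -> value map (illustration only;
# real HBase also keeps multiple timestamped versions per cell).
class ToyTable:
    def __init__(self, name):
        self.name = name
        self.cells = {}  # (row, column) -> value

    def put(self, row, column, value):
        self.cells[(row, column)] = value

    def get(self, row, column):
        return self.cells.get((row, column))

    def scan(self):
        # Rows come back in sorted order, as in an HBase scan.
        return sorted(self.cells.items())

t = ToyTable("Student")
t.put("zhangsan", "score:Computer", "77")
t.put("lisi", "score:Math", "100")
t.put("lisi", "score:Math", "95")  # a later put overwrites, like the shell example
print(t.get("zhangsan", "score:Computer"))  # 77
print(t.get("lisi", "score:Math"))          # 95
```

This is why "modifying" lisi's Math score in the shell is just another put on the same row and column.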

image-20230522141325066

3. Redis

image-20230522141447693

3.3 Redis Database Operations

Student key-value pairs:

zhangsan: {
    English: 69
    Math: 86
    Computer: 77
}

lisi: {
    English: 55
    Math: 100
    Computer: 88
}

1. Based on the key-value pairs above, design the table using Redis hash structures; (keys such as student.zhangsan and student.lisi can be used to indicate that the two hashes belong to the same table)

image-20230522141608355

a) After the design, use the hgetall command to output zhangsan's and lisi's scores, and provide screenshots;

image-20230522142039509

image-20230522142056249

b) Use the hget command to query zhangsan's Computer score, and provide a screenshot.

image-20230522142115671

c) Change lisi's Math score to 95, and provide a screenshot.

image-20230522142139315

2. Using the student table designed above, perform the following Redis operations

a) Add data: English: 45, Math: 89, Computer: 100

scofield: {
    English: 45
    Math: 89
    Computer: 100
}

image-20230522142306404

b) Retrieve scofield's English score

image-20230522142233197

# Build the hash structure
HSET student.zhangsan English 69
HSET student.zhangsan Math 86
HSET student.zhangsan Computer 77

HSET student.lisi English 55
HSET student.lisi Math 100
HSET student.lisi Computer 88

# Output zhangsan's and lisi's scores
HGETALL student.zhangsan
HGETALL student.lisi

# Query zhangsan's Computer score
HGET student.zhangsan Computer

# Change lisi's Math score to 95
HSET student.lisi Math 95

# Output zhangsan's and lisi's scores
HGETALL student.zhangsan
HGETALL student.lisi

# Add data
HSET student.scofield English 45
HSET student.scofield Math 89
HSET student.scofield Computer 100

# Get scofield's English score
HGET student.scofield English
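The semantics of these commands are easy to picture as a dict of dicts: each hash key maps to its own field-to-value table, and HSET on an existing field overwrites it. The sketch below emulates just that behavior in Python (an illustration of the semantics, not a Redis client):

```python
# Toy emulation of Redis hash commands: one inner dict per hash key
# (illustration only; real Redis adds expiry, encoding tricks, etc.).
store = {}

def hset(key, field, value):
    store.setdefault(key, {})[field] = value

def hget(key, field):
    return store.get(key, {}).get(field)

def hgetall(key):
    return dict(store.get(key, {}))

hset("student.zhangsan", "English", 69)
hset("student.zhangsan", "Math", 86)
hset("student.zhangsan", "Computer", 77)
hset("student.lisi", "Math", 100)
hset("student.lisi", "Math", 95)  # HSET on an existing field overwrites it

print(hgetall("student.zhangsan"))
print(hget("student.lisi", "Math"))  # 95
```

This also mirrors why hgetall needs exactly one argument: it selects which inner table to return.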

image-20230522141727899

(error) ERR wrong number of arguments for 'hgetall' command
127.0.0.1:6379> 

In Redis, the error message "ERR wrong number of arguments for 'hgetall' command" means that hgetall was called with the wrong number of arguments. This usually means the key argument required by hgetall was not supplied.

The hgetall command returns all fields and values of the hash stored at the given key. Its syntax is:

HGETALL key

Here, key is the hash you want to read. According to the error message, the key argument was either missing or malformed.

To resolve this, provide the correct key when running hgetall, for example:

HGETALL myhash

This returns all fields and values of the hash named "myhash".

Make sure the command is entered correctly in the Redis CLI with the required arguments. If the problem persists, check your code or script to confirm the arguments are passed correctly.

4. MongoDB

image-20230522142552002

3.4 MongoDB Database Operations

The Student documents are as follows:

{
  "name": "zhangsan",
  "score": {
    "English": 69,
    "Math": 86,
    "Computer": 77
  }
}

{
  "name": "lisi",
  "score": {
    "English": 55,
    "Math": 100,
    "Computer": 88
  }
}

1. Based on the documents above, design the student collection in the mongo shell.

image-20230522142825418

image-20230522142850857

a) After the design, use the find() method to output both students, and provide a screenshot;

image-20230522142917422

b) Use find() to query all of zhangsan's scores (showing only the score field), and provide a screenshot.

image-20230522142939312

c) Change lisi's Math score to 95, and provide a screenshot.

image-20230522143002889

2. Using the student collection designed above, perform the following mongo shell operations

a) Add data: English: 45, Math: 89, Computer: 100

{
  "name": "scofield",
  "score": {
    "English": 45,
    "Math": 89,
    "Computer": 100
  }
}

image-20230522143059067

b) Retrieve all of scofield's scores (showing only the score field)

image-20230522143119153

// Create the student collection and insert the documents
db.student.insertOne({
  "name": "zhangsan",
  "score": {
    "English": 69,
    "Math": 86,
    "Computer": 77
  }
});

db.student.insertOne({
  "name": "lisi",
  "score": {
    "English": 55,
    "Math": 100,
    "Computer": 88
  }
});

// Output both students
db.student.find();

// Query all of zhangsan's scores
db.student.find({ "name": "zhangsan" }, { "score": 1, "_id": 0 });

// Change lisi's Math score to 95
db.student.updateOne({ "name": "lisi" }, { $set: { "score.Math": 95 } });

// Output both students
db.student.find();

// Add data
db.student.insertOne({
  "name": "scofield",
  "score": {
    "English": 45,
    "Math": 89,
    "Computer": 100
  }
});

// Get scofield's scores
db.student.find({ "name": "scofield" }, { "score": 1, "_id": 0 });
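The update above uses a dotted path, "score.Math", so $set reaches inside the nested score sub-document instead of replacing it. A minimal Python sketch of that dotted-path behavior (an illustration of the semantics, not MongoDB's implementation, which also handles arrays, upserts, and more):

```python
# Minimal emulation of MongoDB's dotted-path $set on a nested document
# (illustration only).
def set_path(doc, path, value):
    """Set doc[k1][k2]...[kn] = value for path 'k1.k2...kn', creating
    intermediate dicts as needed."""
    keys = path.split(".")
    node = doc
    for key in keys[:-1]:
        node = node.setdefault(key, {})
    node[keys[-1]] = value

lisi = {"name": "lisi", "score": {"English": 55, "Math": 100, "Computer": 88}}
set_path(lisi, "score.Math", 95)
print(lisi["score"])  # {'English': 55, 'Math': 95, 'Computer': 88}
```

Note how the other score fields survive: only the addressed leaf is replaced.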

image-20230522142708657

Warning: the "mongo" shell has been superseded by "mongosh",
which delivers improved usability and compatibility. The "mongo" shell has been deprecated and will be removed in
an upcoming release.
For installation instructions, see
https://docs.mongodb.com/mongodb-shell/install/
================
---
The server generated these startup warnings when booting: 
        2023-05-20T07:40:59.745-07:00: Using the XFS filesystem is strongly recommended with the WiredTiger storage engine. See http://dochub.mongodb.org/core/prodnotes-filesystem
        2023-05-20T07:41:01.362-07:00: Access control is not enabled for the database. Read and write access to data and configuration is unrestricted
---

image-20230522143318583

According to the warning messages, there are two startup warnings:

  1. The XFS filesystem is strongly recommended when using the WiredTiger storage engine.
  2. Access control is not enabled for the database, so read and write access to data and configuration is unrestricted.

Suggestions for resolving these warnings:

  1. Use the XFS filesystem: the warning notes that XFS is strongly recommended with the WiredTiger storage engine, as it offers excellent performance and scalability for MongoDB. If you are not currently using XFS, consider migrating to it; see the linked production-notes documentation for more on the XFS filesystem.
  2. Enable access control: the warning indicates that anyone can read and write the data and configuration, which is unsafe for a production environment. To protect your MongoDB database, enable access control and grant users appropriate permissions; see the access-control section of the official MongoDB documentation for details on enabling it and configuring user roles.

Resolving these warnings requires configuration changes and some system administration. Before making any change, read the relevant documentation carefully and make sure you understand the impact on your system and environment.
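For the second warning, enabling access control is typically a one-line change in the mongod configuration file. The fragment below is a hedged example (the file location /etc/mongod.conf and the need to restart the service depend on your installation; create an administrative user before enabling this, or you will lock yourself out):

```yaml
# /etc/mongod.conf — enable authentication
security:
  authorization: enabled
```

After restarting mongod, clients must authenticate with a user that has appropriate roles.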

2025-08-23 21:58:43,032 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(192.168.88.9:9866, datanodeUuid=fbbcb827-5b1b-4acd-a251-97a412da99d1, infoPort=9864, infoSecurePort=0, ipcPort=9867, storageInfo=lv=-57;cid=CID-b1ee6ae8-73a9-4763-a24f-2c840eac62b2;nsid=122842789;c=1755955721769) storage fbbcb827-5b1b-4acd-a251-97a412da99d1 2025-08-23 21:58:43,032 INFO org.apache.hadoop.net.NetworkTopology: Adding a new node: /default-rack/192.168.88.9:9866 2025-08-23 21:58:43,032 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockReportLeaseManager: Registered DN fbbcb827-5b1b-4acd-a251-97a412da99d1 (192.168.88.9:9866). 2025-08-23 21:58:43,156 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding new storage ID DS-09455080-1e07-42ac-8c0c-97bd0539bce4 for DN 192.168.88.10:9866 2025-08-23 21:58:43,171 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding new storage ID DS-e0af5c38-05e1-40cc-9839-7ca2a9555353 for DN 192.168.88.9:9866 2025-08-23 21:58:43,242 INFO BlockStateChange: BLOCK* processReport 0x95bc76793dd4a335 with lease ID 0xf16c10099c3274d4: Processing first storage report for DS-09455080-1e07-42ac-8c0c-97bd0539bce4 from datanode DatanodeRegistration(192.168.88.10:9866, datanodeUuid=da4db3bb-c966-465f-a90f-5d54ae18f35b, infoPort=9864, infoSecurePort=0, ipcPort=9867, storageInfo=lv=-57;cid=CID-b1ee6ae8-73a9-4763-a24f-2c840eac62b2;nsid=122842789;c=1755955721769) 2025-08-23 21:58:43,244 INFO BlockStateChange: BLOCK* processReport 0x95bc76793dd4a335 with lease ID 0xf16c10099c3274d4: from storage DS-09455080-1e07-42ac-8c0c-97bd0539bce4 node DatanodeRegistration(192.168.88.10:9866, datanodeUuid=da4db3bb-c966-465f-a90f-5d54ae18f35b, infoPort=9864, infoSecurePort=0, ipcPort=9867, storageInfo=lv=-57;cid=CID-b1ee6ae8-73a9-4763-a24f-2c840eac62b2;nsid=122842789;c=1755955721769), blocks: 0, hasStaleStorage: false, processing time: 3 msecs, invalidatedBlocks: 0 2025-08-23 21:58:43,244 
INFO BlockStateChange: BLOCK* processReport 0x76705665125f17dc with lease ID 0xf16c10099c3274d5: Processing first storage report for DS-e0af5c38-05e1-40cc-9839-7ca2a9555353 from datanode DatanodeRegistration(192.168.88.9:9866, datanodeUuid=fbbcb827-5b1b-4acd-a251-97a412da99d1, infoPort=9864, infoSecurePort=0, ipcPort=9867, storageInfo=lv=-57;cid=CID-b1ee6ae8-73a9-4763-a24f-2c840eac62b2;nsid=122842789;c=1755955721769) 2025-08-23 21:58:43,245 INFO BlockStateChange: BLOCK* processReport 0x76705665125f17dc with lease ID 0xf16c10099c3274d5: from storage DS-e0af5c38-05e1-40cc-9839-7ca2a9555353 node DatanodeRegistration(192.168.88.9:9866, datanodeUuid=fbbcb827-5b1b-4acd-a251-97a412da99d1, infoPort=9864, infoSecurePort=0, ipcPort=9867, storageInfo=lv=-57;cid=CID-b1ee6ae8-73a9-4763-a24f-2c840eac62b2;nsid=122842789;c=1755955721769), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0 2025-08-23 21:58:44,498 INFO org.apache.hadoop.hdfs.StateChange: BLOCK* registerDatanode: from DatanodeRegistration(192.168.88.8:9866, datanodeUuid=5e3fedcf-07cf-4702-9b47-a4e827a92a5f, infoPort=9864, infoSecurePort=0, ipcPort=9867, storageInfo=lv=-57;cid=CID-b1ee6ae8-73a9-4763-a24f-2c840eac62b2;nsid=122842789;c=1755955721769) storage 5e3fedcf-07cf-4702-9b47-a4e827a92a5f 2025-08-23 21:58:44,498 INFO org.apache.hadoop.net.NetworkTopology: Adding a new node: /default-rack/192.168.88.8:9866 2025-08-23 21:58:44,498 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockReportLeaseManager: Registered DN 5e3fedcf-07cf-4702-9b47-a4e827a92a5f (192.168.88.8:9866). 
2025-08-23 21:58:44,548 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeDescriptor: Adding new storage ID DS-0ade9864-83fa-4d01-933d-bd926d937ee1 for DN 192.168.88.8:9866 2025-08-23 21:58:44,589 INFO BlockStateChange: BLOCK* processReport 0xba3c5becc84253a1 with lease ID 0xf16c10099c3274d6: Processing first storage report for DS-0ade9864-83fa-4d01-933d-bd926d937ee1 from datanode DatanodeRegistration(192.168.88.8:9866, datanodeUuid=5e3fedcf-07cf-4702-9b47-a4e827a92a5f, infoPort=9864, infoSecurePort=0, ipcPort=9867, storageInfo=lv=-57;cid=CID-b1ee6ae8-73a9-4763-a24f-2c840eac62b2;nsid=122842789;c=1755955721769) 2025-08-23 21:58:44,590 INFO BlockStateChange: BLOCK* processReport 0xba3c5becc84253a1 with lease ID 0xf16c10099c3274d6: from storage DS-0ade9864-83fa-4d01-933d-bd926d937ee1 node DatanodeRegistration(192.168.88.8:9866, datanodeUuid=5e3fedcf-07cf-4702-9b47-a4e827a92a5f, infoPort=9864, infoSecurePort=0, ipcPort=9867, storageInfo=lv=-57;cid=CID-b1ee6ae8-73a9-4763-a24f-2c840eac62b2;nsid=122842789;c=1755955721769), blocks: 0, hasStaleStorage: false, processing time: 0 msecs, invalidatedBlocks: 0 2025-08-23 22:01:04,977 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: RECEIVED SIGNAL 15: SIGTERM 2025-08-23 22:01:04,982 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG: /************************************************************ SHUTDOWN_MSG: Shutting down NameNode at master/192.168.88.8 ************************************************************/ 2025-08-23 22:01:16,701 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: STARTUP_MSG: /************************************************************ STARTUP_MSG: Starting NameNode STARTUP_MSG: host = master/192.168.88.8 STARTUP_MSG: args = [] STARTUP_MSG: version = 3.3.6 STARTUP_MSG: classpath = 
/server/hadoop/etc/hadoop:/server/hadoop/share/hadoop/common/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/server/hadoop/share/hadoop/common/lib/kerby-pkix-1.0.1.jar:/server/hadoop/share/hadoop/common/lib/jackson-annotations-2.12.7.jar:/server/hadoop/share/hadoop/common/lib/netty-handler-ssl-ocsp-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/metrics-core-3.2.4.jar:/server/hadoop/share/hadoop/common/lib/netty-resolver-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/commons-text-1.10.0.jar:/server/hadoop/share/hadoop/common/lib/netty-transport-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/hadoop-shaded-protobuf_3_7-1.1.1.jar:/server/hadoop/share/hadoop/common/lib/jackson-mapper-asl-1.9.13.jar:/server/hadoop/share/hadoop/common/lib/jetty-server-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/common/lib/netty-common-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/kerby-util-1.0.1.jar:/server/hadoop/share/hadoop/common/lib/hadoop-annotations-3.3.6.jar:/server/hadoop/share/hadoop/common/lib/failureaccess-1.0.jar:/server/hadoop/share/hadoop/common/lib/jersey-json-1.20.jar:/server/hadoop/share/hadoop/common/lib/nimbus-jose-jwt-9.8.1.jar:/server/hadoop/share/hadoop/common/lib/netty-resolver-dns-native-macos-4.1.89.Final-osx-x86_64.jar:/server/hadoop/share/hadoop/common/lib/netty-codec-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/snappy-java-1.1.8.2.jar:/server/hadoop/share/hadoop/common/lib/kerb-identity-1.0.1.jar:/server/hadoop/share/hadoop/common/lib/kerb-util-1.0.1.jar:/server/hadoop/share/hadoop/common/lib/kerb-client-1.0.1.jar:/server/hadoop/share/hadoop/common/lib/gson-2.9.0.jar:/server/hadoop/share/hadoop/common/lib/commons-logging-1.1.3.jar:/server/hadoop/share/hadoop/common/lib/netty-transport-native-unix-common-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/netty-codec-redis-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/netty-codec-http2-4.1.89.Final.jar:/server/hadoop/share/ha
doop/common/lib/netty-transport-native-kqueue-4.1.89.Final-osx-aarch_64.jar:/server/hadoop/share/hadoop/common/lib/jetty-util-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/common/lib/jsp-api-2.1.jar:/server/hadoop/share/hadoop/common/lib/jetty-util-ajax-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/common/lib/zookeeper-3.6.3.jar:/server/hadoop/share/hadoop/common/lib/guava-27.0-jre.jar:/server/hadoop/share/hadoop/common/lib/httpcore-4.4.13.jar:/server/hadoop/share/hadoop/common/lib/netty-transport-rxtx-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/kerb-admin-1.0.1.jar:/server/hadoop/share/hadoop/common/lib/curator-client-5.2.0.jar:/server/hadoop/share/hadoop/common/lib/netty-buffer-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/netty-transport-native-epoll-4.1.89.Final-linux-x86_64.jar:/server/hadoop/share/hadoop/common/lib/jcip-annotations-1.0-1.jar:/server/hadoop/share/hadoop/common/lib/netty-codec-xml-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/javax.servlet-api-3.1.0.jar:/server/hadoop/share/hadoop/common/lib/netty-resolver-dns-classes-macos-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/curator-recipes-5.2.0.jar:/server/hadoop/share/hadoop/common/lib/commons-net-3.9.0.jar:/server/hadoop/share/hadoop/common/lib/jackson-databind-2.12.7.1.jar:/server/hadoop/share/hadoop/common/lib/commons-beanutils-1.9.4.jar:/server/hadoop/share/hadoop/common/lib/netty-resolver-dns-native-macos-4.1.89.Final-osx-aarch_64.jar:/server/hadoop/share/hadoop/common/lib/jaxb-impl-2.2.3-1.jar:/server/hadoop/share/hadoop/common/lib/jettison-1.5.4.jar:/server/hadoop/share/hadoop/common/lib/slf4j-api-1.7.36.jar:/server/hadoop/share/hadoop/common/lib/jsr305-3.0.2.jar:/server/hadoop/share/hadoop/common/lib/netty-codec-dns-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/animal-sniffer-annotations-1.17.jar:/server/hadoop/share/hadoop/common/lib/kerb-core-1.0.1.jar:/server/hadoop/share/hadoop/common/lib/jetty-servlet-9.4.51.v20230217.jar:/serve
r/hadoop/share/hadoop/common/lib/netty-codec-socks-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/commons-io-2.8.0.jar:/server/hadoop/share/hadoop/common/lib/netty-transport-classes-epoll-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/re2j-1.1.jar:/server/hadoop/share/hadoop/common/lib/netty-handler-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/commons-daemon-1.0.13.jar:/server/hadoop/share/hadoop/common/lib/slf4j-reload4j-1.7.36.jar:/server/hadoop/share/hadoop/common/lib/commons-codec-1.15.jar:/server/hadoop/share/hadoop/common/lib/woodstox-core-5.4.0.jar:/server/hadoop/share/hadoop/common/lib/netty-codec-stomp-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/zookeeper-jute-3.6.3.jar:/server/hadoop/share/hadoop/common/lib/kerby-asn1-1.0.1.jar:/server/hadoop/share/hadoop/common/lib/kerby-config-1.0.1.jar:/server/hadoop/share/hadoop/common/lib/jackson-core-asl-1.9.13.jar:/server/hadoop/share/hadoop/common/lib/stax2-api-4.2.1.jar:/server/hadoop/share/hadoop/common/lib/netty-codec-http-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/kerb-crypto-1.0.1.jar:/server/hadoop/share/hadoop/common/lib/jakarta.activation-api-1.2.1.jar:/server/hadoop/share/hadoop/common/lib/netty-transport-classes-kqueue-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/protobuf-java-2.5.0.jar:/server/hadoop/share/hadoop/common/lib/hadoop-shaded-guava-1.1.1.jar:/server/hadoop/share/hadoop/common/lib/jackson-core-2.12.7.jar:/server/hadoop/share/hadoop/common/lib/commons-configuration2-2.8.0.jar:/server/hadoop/share/hadoop/common/lib/jaxb-api-2.2.11.jar:/server/hadoop/share/hadoop/common/lib/jetty-security-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/common/lib/token-provider-1.0.1.jar:/server/hadoop/share/hadoop/common/lib/reload4j-1.2.22.jar:/server/hadoop/share/hadoop/common/lib/netty-codec-haproxy-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/curator-framework-5.2.0.jar:/server/hadoop/share/hadoop/common/lib/jsch-0.1.55.jar:/server/
hadoop/share/hadoop/common/lib/netty-transport-native-kqueue-4.1.89.Final-osx-x86_64.jar:/server/hadoop/share/hadoop/common/lib/checker-qual-2.5.2.jar:/server/hadoop/share/hadoop/common/lib/netty-codec-memcache-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/jetty-http-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/common/lib/netty-handler-proxy-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/netty-transport-native-epoll-4.1.89.Final-linux-aarch_64.jar:/server/hadoop/share/hadoop/common/lib/jetty-webapp-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/common/lib/jersey-server-1.19.4.jar:/server/hadoop/share/hadoop/common/lib/netty-transport-sctp-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/netty-resolver-dns-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/kerby-xdr-1.0.1.jar:/server/hadoop/share/hadoop/common/lib/jul-to-slf4j-1.7.36.jar:/server/hadoop/share/hadoop/common/lib/commons-lang3-3.12.0.jar:/server/hadoop/share/hadoop/common/lib/kerb-common-1.0.1.jar:/server/hadoop/share/hadoop/common/lib/jetty-io-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/common/lib/netty-transport-udt-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/kerb-simplekdc-1.0.1.jar:/server/hadoop/share/hadoop/common/lib/jetty-xml-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/common/lib/paranamer-2.3.jar:/server/hadoop/share/hadoop/common/lib/commons-math3-3.1.1.jar:/server/hadoop/share/hadoop/common/lib/audience-annotations-0.5.0.jar:/server/hadoop/share/hadoop/common/lib/jersey-servlet-1.19.4.jar:/server/hadoop/share/hadoop/common/lib/netty-codec-smtp-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/dnsjava-2.1.7.jar:/server/hadoop/share/hadoop/common/lib/commons-collections-3.2.2.jar:/server/hadoop/share/hadoop/common/lib/netty-codec-mqtt-4.1.89.Final.jar:/server/hadoop/share/hadoop/common/lib/kerb-server-1.0.1.jar:/server/hadoop/share/hadoop/common/lib/httpclient-4.5.13.jar:/server/hadoop/share/hadoop/common/lib/netty-all-4.1.89.Final
.jar:/server/hadoop/share/hadoop/common/lib/j2objc-annotations-1.1.jar:/server/hadoop/share/hadoop/common/lib/jersey-core-1.19.4.jar:/server/hadoop/share/hadoop/common/lib/commons-cli-1.2.jar:/server/hadoop/share/hadoop/common/lib/hadoop-auth-3.3.6.jar:/server/hadoop/share/hadoop/common/lib/jsr311-api-1.1.1.jar:/server/hadoop/share/hadoop/common/lib/commons-compress-1.21.jar:/server/hadoop/share/hadoop/common/lib/avro-1.7.7.jar:/server/hadoop/share/hadoop/common/hadoop-registry-3.3.6.jar:/server/hadoop/share/hadoop/common/hadoop-common-3.3.6.jar:/server/hadoop/share/hadoop/common/hadoop-kms-3.3.6.jar:/server/hadoop/share/hadoop/common/hadoop-nfs-3.3.6.jar:/server/hadoop/share/hadoop/common/hadoop-common-3.3.6-tests.jar:/server/hadoop/share/hadoop/hdfs:/server/hadoop/share/hadoop/hdfs/lib/listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar:/server/hadoop/share/hadoop/hdfs/lib/kerby-pkix-1.0.1.jar:/server/hadoop/share/hadoop/hdfs/lib/jackson-annotations-2.12.7.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-handler-ssl-ocsp-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/metrics-core-3.2.4.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-resolver-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/commons-text-1.10.0.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-transport-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/hadoop-shaded-protobuf_3_7-1.1.1.jar:/server/hadoop/share/hadoop/hdfs/lib/jackson-mapper-asl-1.9.13.jar:/server/hadoop/share/hadoop/hdfs/lib/jetty-server-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-common-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/kerby-util-1.0.1.jar:/server/hadoop/share/hadoop/hdfs/lib/hadoop-annotations-3.3.6.jar:/server/hadoop/share/hadoop/hdfs/lib/failureaccess-1.0.jar:/server/hadoop/share/hadoop/hdfs/lib/jersey-json-1.20.jar:/server/hadoop/share/hadoop/hdfs/lib/nimbus-jose-jwt-9.8.1.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-resolver-dns-native-macos-4.1.89.Final-osx-x86_64
.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-codec-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/snappy-java-1.1.8.2.jar:/server/hadoop/share/hadoop/hdfs/lib/okio-2.8.0.jar:/server/hadoop/share/hadoop/hdfs/lib/kerb-identity-1.0.1.jar:/server/hadoop/share/hadoop/hdfs/lib/kerb-util-1.0.1.jar:/server/hadoop/share/hadoop/hdfs/lib/kerb-client-1.0.1.jar:/server/hadoop/share/hadoop/hdfs/lib/gson-2.9.0.jar:/server/hadoop/share/hadoop/hdfs/lib/kotlin-stdlib-1.4.10.jar:/server/hadoop/share/hadoop/hdfs/lib/commons-logging-1.1.3.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-transport-native-unix-common-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-codec-redis-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-codec-http2-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-transport-native-kqueue-4.1.89.Final-osx-aarch_64.jar:/server/hadoop/share/hadoop/hdfs/lib/jetty-util-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/hdfs/lib/jetty-util-ajax-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/hdfs/lib/zookeeper-3.6.3.jar:/server/hadoop/share/hadoop/hdfs/lib/guava-27.0-jre.jar:/server/hadoop/share/hadoop/hdfs/lib/httpcore-4.4.13.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-transport-rxtx-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/kerb-admin-1.0.1.jar:/server/hadoop/share/hadoop/hdfs/lib/curator-client-5.2.0.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-buffer-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-transport-native-epoll-4.1.89.Final-linux-x86_64.jar:/server/hadoop/share/hadoop/hdfs/lib/jcip-annotations-1.0-1.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-codec-xml-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/javax.servlet-api-3.1.0.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-resolver-dns-classes-macos-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/curator-recipes-5.2.0.jar:/server/hadoop/share/hadoop/hdfs/lib/commons-net-3.9.0.jar:/server/hadoop/share/hadoop/hdfs/lib/jackson-databind
-2.12.7.1.jar:/server/hadoop/share/hadoop/hdfs/lib/commons-beanutils-1.9.4.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-resolver-dns-native-macos-4.1.89.Final-osx-aarch_64.jar:/server/hadoop/share/hadoop/hdfs/lib/jaxb-impl-2.2.3-1.jar:/server/hadoop/share/hadoop/hdfs/lib/jettison-1.5.4.jar:/server/hadoop/share/hadoop/hdfs/lib/jsr305-3.0.2.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-codec-dns-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/animal-sniffer-annotations-1.17.jar:/server/hadoop/share/hadoop/hdfs/lib/kerb-core-1.0.1.jar:/server/hadoop/share/hadoop/hdfs/lib/jetty-servlet-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-codec-socks-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/commons-io-2.8.0.jar:/server/hadoop/share/hadoop/hdfs/lib/leveldbjni-all-1.8.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-transport-classes-epoll-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/re2j-1.1.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-handler-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/commons-daemon-1.0.13.jar:/server/hadoop/share/hadoop/hdfs/lib/commons-codec-1.15.jar:/server/hadoop/share/hadoop/hdfs/lib/woodstox-core-5.4.0.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-codec-stomp-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/zookeeper-jute-3.6.3.jar:/server/hadoop/share/hadoop/hdfs/lib/kerby-asn1-1.0.1.jar:/server/hadoop/share/hadoop/hdfs/lib/kerby-config-1.0.1.jar:/server/hadoop/share/hadoop/hdfs/lib/jackson-core-asl-1.9.13.jar:/server/hadoop/share/hadoop/hdfs/lib/stax2-api-4.2.1.jar:/server/hadoop/share/hadoop/hdfs/lib/kotlin-stdlib-common-1.4.10.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-codec-http-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/kerb-crypto-1.0.1.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-3.10.6.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/jakarta.activation-api-1.2.1.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-transport-classes-kqueue-4.1.89.Final.jar:/server/hadoop/sh
are/hadoop/hdfs/lib/protobuf-java-2.5.0.jar:/server/hadoop/share/hadoop/hdfs/lib/hadoop-shaded-guava-1.1.1.jar:/server/hadoop/share/hadoop/hdfs/lib/jackson-core-2.12.7.jar:/server/hadoop/share/hadoop/hdfs/lib/okhttp-4.9.3.jar:/server/hadoop/share/hadoop/hdfs/lib/commons-configuration2-2.8.0.jar:/server/hadoop/share/hadoop/hdfs/lib/jaxb-api-2.2.11.jar:/server/hadoop/share/hadoop/hdfs/lib/jetty-security-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/hdfs/lib/token-provider-1.0.1.jar:/server/hadoop/share/hadoop/hdfs/lib/reload4j-1.2.22.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-codec-haproxy-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/curator-framework-5.2.0.jar:/server/hadoop/share/hadoop/hdfs/lib/jsch-0.1.55.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-transport-native-kqueue-4.1.89.Final-osx-x86_64.jar:/server/hadoop/share/hadoop/hdfs/lib/checker-qual-2.5.2.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-codec-memcache-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/jetty-http-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-handler-proxy-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-transport-native-epoll-4.1.89.Final-linux-aarch_64.jar:/server/hadoop/share/hadoop/hdfs/lib/jetty-webapp-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/hdfs/lib/jersey-server-1.19.4.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-transport-sctp-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-resolver-dns-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/kerby-xdr-1.0.1.jar:/server/hadoop/share/hadoop/hdfs/lib/HikariCP-java7-2.4.12.jar:/server/hadoop/share/hadoop/hdfs/lib/commons-lang3-3.12.0.jar:/server/hadoop/share/hadoop/hdfs/lib/kerb-common-1.0.1.jar:/server/hadoop/share/hadoop/hdfs/lib/jetty-io-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-transport-udt-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/kerb-simplekdc-1.0.1.jar:/server/hadoop/share/hadoop/hdfs/lib/jetty-xml-9.4.51.v20230217.jar
:/server/hadoop/share/hadoop/hdfs/lib/paranamer-2.3.jar:/server/hadoop/share/hadoop/hdfs/lib/commons-math3-3.1.1.jar:/server/hadoop/share/hadoop/hdfs/lib/audience-annotations-0.5.0.jar:/server/hadoop/share/hadoop/hdfs/lib/jersey-servlet-1.19.4.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-codec-smtp-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/dnsjava-2.1.7.jar:/server/hadoop/share/hadoop/hdfs/lib/commons-collections-3.2.2.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-codec-mqtt-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/kerb-server-1.0.1.jar:/server/hadoop/share/hadoop/hdfs/lib/json-simple-1.1.1.jar:/server/hadoop/share/hadoop/hdfs/lib/httpclient-4.5.13.jar:/server/hadoop/share/hadoop/hdfs/lib/netty-all-4.1.89.Final.jar:/server/hadoop/share/hadoop/hdfs/lib/j2objc-annotations-1.1.jar:/server/hadoop/share/hadoop/hdfs/lib/jersey-core-1.19.4.jar:/server/hadoop/share/hadoop/hdfs/lib/commons-cli-1.2.jar:/server/hadoop/share/hadoop/hdfs/lib/hadoop-auth-3.3.6.jar:/server/hadoop/share/hadoop/hdfs/lib/jsr311-api-1.1.1.jar:/server/hadoop/share/hadoop/hdfs/lib/commons-compress-1.21.jar:/server/hadoop/share/hadoop/hdfs/lib/avro-1.7.7.jar:/server/hadoop/share/hadoop/hdfs/hadoop-hdfs-rbf-3.3.6-tests.jar:/server/hadoop/share/hadoop/hdfs/hadoop-hdfs-nfs-3.3.6.jar:/server/hadoop/share/hadoop/hdfs/hadoop-hdfs-native-client-3.3.6.jar:/server/hadoop/share/hadoop/hdfs/hadoop-hdfs-native-client-3.3.6-tests.jar:/server/hadoop/share/hadoop/hdfs/hadoop-hdfs-client-3.3.6.jar:/server/hadoop/share/hadoop/hdfs/hadoop-hdfs-rbf-3.3.6.jar:/server/hadoop/share/hadoop/hdfs/hadoop-hdfs-httpfs-3.3.6.jar:/server/hadoop/share/hadoop/hdfs/hadoop-hdfs-3.3.6.jar:/server/hadoop/share/hadoop/hdfs/hadoop-hdfs-3.3.6-tests.jar:/server/hadoop/share/hadoop/hdfs/hadoop-hdfs-client-3.3.6-tests.jar:/server/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-plugins-3.3.6.jar:/server/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-common-3.3.6.jar:/server/hadoop/share/hadoop/mapreduce
/hadoop-mapreduce-client-jobclient-3.3.6-tests.jar:/server/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-uploader-3.3.6.jar:/server/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-nativetask-3.3.6.jar:/server/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.6.jar:/server/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-app-3.3.6.jar:/server/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-shuffle-3.3.6.jar:/server/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-hs-3.3.6.jar:/server/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-core-3.3.6.jar:/server/hadoop/share/hadoop/mapreduce/hadoop-mapreduce-client-jobclient-3.3.6.jar:/server/hadoop/share/hadoop/yarn:/server/hadoop/share/hadoop/yarn/lib/snakeyaml-2.0.jar:/server/hadoop/share/hadoop/yarn/lib/bcpkix-jdk15on-1.68.jar:/server/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-json-provider-2.12.7.jar:/server/hadoop/share/hadoop/yarn/lib/jetty-client-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/yarn/lib/jersey-client-1.19.4.jar:/server/hadoop/share/hadoop/yarn/lib/javax-websocket-client-impl-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/yarn/lib/jline-3.9.0.jar:/server/hadoop/share/hadoop/yarn/lib/javax.websocket-client-api-1.0.jar:/server/hadoop/share/hadoop/yarn/lib/swagger-annotations-1.5.4.jar:/server/hadoop/share/hadoop/yarn/lib/mssql-jdbc-6.2.1.jre7.jar:/server/hadoop/share/hadoop/yarn/lib/guice-servlet-4.0.jar:/server/hadoop/share/hadoop/yarn/lib/jetty-annotations-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/yarn/lib/jetty-jndi-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/yarn/lib/aopalliance-1.0.jar:/server/hadoop/share/hadoop/yarn/lib/jackson-jaxrs-base-2.12.7.jar:/server/hadoop/share/hadoop/yarn/lib/json-io-2.5.1.jar:/server/hadoop/share/hadoop/yarn/lib/javax.inject-1.jar:/server/hadoop/share/hadoop/yarn/lib/objenesis-2.6.jar:/server/hadoop/share/hadoop/yarn/lib/asm-tree-9.4.jar:/server/hadoop/share/hadoop/yarn/lib/geronimo-jcache_1.0_spec-1.0-alpha-1.
jar:/server/hadoop/share/hadoop/yarn/lib/websocket-common-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/yarn/lib/jetty-plus-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/yarn/lib/jna-5.2.0.jar:/server/hadoop/share/hadoop/yarn/lib/javax.websocket-api-1.0.jar:/server/hadoop/share/hadoop/yarn/lib/javax-websocket-server-impl-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/yarn/lib/websocket-api-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/yarn/lib/websocket-client-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/yarn/lib/jackson-module-jaxb-annotations-2.12.7.jar:/server/hadoop/share/hadoop/yarn/lib/bcprov-jdk15on-1.68.jar:/server/hadoop/share/hadoop/yarn/lib/asm-commons-9.4.jar:/server/hadoop/share/hadoop/yarn/lib/guice-4.0.jar:/server/hadoop/share/hadoop/yarn/lib/websocket-servlet-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/yarn/lib/websocket-server-9.4.51.v20230217.jar:/server/hadoop/share/hadoop/yarn/lib/jakarta.xml.bind-api-2.3.2.jar:/server/hadoop/share/hadoop/yarn/lib/fst-2.50.jar:/server/hadoop/share/hadoop/yarn/lib/jersey-guice-1.19.4.jar:/server/hadoop/share/hadoop/yarn/lib/ehcache-3.3.1.jar:/server/hadoop/share/hadoop/yarn/lib/java-util-1.9.0.jar:/server/hadoop/share/hadoop/yarn/hadoop-yarn-registry-3.3.6.jar:/server/hadoop/share/hadoop/yarn/hadoop-yarn-server-nodemanager-3.3.6.jar:/server/hadoop/share/hadoop/yarn/hadoop-yarn-server-sharedcachemanager-3.3.6.jar:/server/hadoop/share/hadoop/yarn/hadoop-yarn-applications-unmanaged-am-launcher-3.3.6.jar:/server/hadoop/share/hadoop/yarn/hadoop-yarn-common-3.3.6.jar:/server/hadoop/share/hadoop/yarn/hadoop-yarn-server-common-3.3.6.jar:/server/hadoop/share/hadoop/yarn/hadoop-yarn-applications-distributedshell-3.3.6.jar:/server/hadoop/share/hadoop/yarn/hadoop-yarn-server-resourcemanager-3.3.6.jar:/server/hadoop/share/hadoop/yarn/hadoop-yarn-server-web-proxy-3.3.6.jar:/server/hadoop/share/hadoop/yarn/hadoop-yarn-server-timeline-pluginstorage-3.3.6.jar:/server/hadoop/share/hadoop/yarn/hadoop-yarn-server-
tests-3.3.6.jar:/server/hadoop/share/hadoop/yarn/hadoop-yarn-client-3.3.6.jar:/server/hadoop/share/hadoop/yarn/hadoop-yarn-applications-mawo-core-3.3.6.jar:/server/hadoop/share/hadoop/yarn/hadoop-yarn-server-router-3.3.6.jar:/server/hadoop/share/hadoop/yarn/hadoop-yarn-services-api-3.3.6.jar:/server/hadoop/share/hadoop/yarn/hadoop-yarn-api-3.3.6.jar:/server/hadoop/share/hadoop/yarn/hadoop-yarn-services-core-3.3.6.jar:/server/hadoop/share/hadoop/yarn/hadoop-yarn-server-applicationhistoryservice-3.3.6.jar STARTUP_MSG: build = https://github.com/apache/hadoop.git -r 1be78238728da9266a4f88195058f08fd012bf9c; compiled by 'ubuntu' on 2023-06-18T08:22Z STARTUP_MSG: java = 1.8.0_401 ************************************************************/ 2025-08-23 22:01:16,708 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: registered UNIX signal handlers for [TERM, HUP, INT] 2025-08-23 22:01:16,801 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: createNameNode [] 2025-08-23 22:01:16,913 INFO org.apache.hadoop.metrics2.impl.MetricsConfig: Loaded properties from hadoop-metrics2.properties 2025-08-23 22:01:16,997 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled Metric snapshot period at 10 second(s). 2025-08-23 22:01:16,997 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system started 2025-08-23 22:01:17,072 INFO org.apache.hadoop.hdfs.server.namenode.NameNodeUtils: fs.defaultFS is hdfs://master:8020 2025-08-23 22:01:17,072 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Clients should use master:8020 to access this namenode/service. 
2025-08-23 22:01:17,291 INFO org.apache.hadoop.util.JvmPauseMonitor: Starting JVM pause monitor
2025-08-23 22:01:17,380 INFO org.apache.hadoop.hdfs.DFSUtil: Filter initializers set : org.apache.hadoop.http.lib.StaticUserWebFilter,org.apache.hadoop.hdfs.web.AuthFilterInitializer
2025-08-23 22:01:17,386 INFO org.apache.hadoop.hdfs.DFSUtil: Starting Web-server for hdfs at: http://0.0.0.0:9870
2025-08-23 22:01:17,398 INFO org.eclipse.jetty.util.log: Logging initialized @1267ms to org.eclipse.jetty.util.log.Slf4jLog
2025-08-23 22:01:17,499 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /root/hadoop-http-auth-signature-secret
2025-08-23 22:01:17,515 INFO org.apache.hadoop.http.HttpRequestLog: Http request log for http.requests.namenode is not defined
2025-08-23 22:01:17,522 INFO org.apache.hadoop.http.HttpServer2: Added global filter 'safety' (class=org.apache.hadoop.http.HttpServer2$QuotingInputFilter)
2025-08-23 22:01:17,524 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context hdfs
2025-08-23 22:01:17,524 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context static
2025-08-23 22:01:17,524 INFO org.apache.hadoop.http.HttpServer2: Added filter static_user_filter (class=org.apache.hadoop.http.lib.StaticUserWebFilter$StaticUserFilter) to context logs
2025-08-23 22:01:17,526 INFO org.apache.hadoop.http.HttpServer2: Added filter AuthFilter (class=org.apache.hadoop.hdfs.web.AuthFilter) to context hdfs
2025-08-23 22:01:17,526 INFO org.apache.hadoop.http.HttpServer2: Added filter AuthFilter (class=org.apache.hadoop.hdfs.web.AuthFilter) to context static
2025-08-23 22:01:17,526 INFO org.apache.hadoop.http.HttpServer2: Added filter AuthFilter (class=org.apache.hadoop.hdfs.web.AuthFilter) to context logs
2025-08-23 22:01:17,559 INFO org.apache.hadoop.http.HttpServer2: addJerseyResourcePackage: packageName=org.apache.hadoop.hdfs.server.namenode.web.resources;org.apache.hadoop.hdfs.web.resources, pathSpec=/webhdfs/v1/*
2025-08-23 22:01:17,569 INFO org.apache.hadoop.http.HttpServer2: Jetty bound to port 9870
2025-08-23 22:01:17,571 INFO org.eclipse.jetty.server.Server: jetty-9.4.51.v20230217; built: 2023-02-17T08:19:37.309Z; git: b45c405e4544384de066f814ed42ae3dceacdd49; jvm 1.8.0_401-b10
2025-08-23 22:01:17,598 INFO org.eclipse.jetty.server.session: DefaultSessionIdManager workerName=node0
2025-08-23 22:01:17,598 INFO org.eclipse.jetty.server.session: No SessionScavenger set, using defaults
2025-08-23 22:01:17,599 INFO org.eclipse.jetty.server.session: node0 Scavenging every 660000ms
2025-08-23 22:01:17,618 WARN org.apache.hadoop.security.authentication.server.AuthenticationFilter: Unable to initialize FileSignerSecretProvider, falling back to use random secrets. Reason: Could not read signature secret file: /root/hadoop-http-auth-signature-secret
2025-08-23 22:01:17,621 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@465232e9{logs,/logs,file:///server/hadoop/logs/,AVAILABLE}
2025-08-23 22:01:17,622 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.s.ServletContextHandler@7486b455{static,/static,file:///server/hadoop/share/hadoop/hdfs/webapps/static/,AVAILABLE}
2025-08-23 22:01:17,691 INFO org.eclipse.jetty.server.handler.ContextHandler: Started o.e.j.w.WebAppContext@2d6c53fc{hdfs,/,file:///server/hadoop/share/hadoop/hdfs/webapps/hdfs/,AVAILABLE}{file:/server/hadoop/share/hadoop/hdfs/webapps/hdfs}
2025-08-23 22:01:17,702 INFO org.eclipse.jetty.server.AbstractConnector: Started ServerConnector@21d03963{HTTP/1.1, (http/1.1)}{0.0.0.0:9870}
2025-08-23 22:01:17,702 INFO org.eclipse.jetty.server.Server: Started @1571ms
2025-08-23 22:01:18,186 INFO org.apache.hadoop.hdfs.server.common.Util: Assuming 'file' scheme for path /server/hadoop/data/nn in configuration.
2025-08-23 22:01:18,186 INFO org.apache.hadoop.hdfs.server.common.Util: Assuming 'file' scheme for path /server/hadoop/data/nn in configuration.
2025-08-23 22:01:18,186 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one image storage directory (dfs.namenode.name.dir) configured. Beware of data loss due to lack of redundant storage directories!
2025-08-23 22:01:18,186 WARN org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Only one namespace edits storage directory (dfs.namenode.edits.dir) configured. Beware of data loss due to lack of redundant storage directories!
2025-08-23 22:01:18,192 INFO org.apache.hadoop.hdfs.server.common.Util: Assuming 'file' scheme for path /server/hadoop/data/nn in configuration.
2025-08-23 22:01:18,192 INFO org.apache.hadoop.hdfs.server.common.Util: Assuming 'file' scheme for path /server/hadoop/data/nn in configuration.
2025-08-23 22:01:18,245 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Edit logging is async:true
2025-08-23 22:01:18,277 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: KeyProvider: null
2025-08-23 22:01:18,280 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsLock is fair: true
2025-08-23 22:01:18,280 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Detailed lock hold time metrics enabled: false
2025-08-23 22:01:18,288 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: fsOwner = root (auth:SIMPLE)
2025-08-23 22:01:18,288 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: supergroup = supergroup
2025-08-23 22:01:18,288 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isPermissionEnabled = true
2025-08-23 22:01:18,288 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: isStoragePolicyEnabled = true
2025-08-23 22:01:18,289 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: HA Enabled: false
2025-08-23 22:01:18,344 INFO org.apache.hadoop.hdfs.server.common.Util: dfs.datanode.fileio.profiling.sampling.percentage set to 0. Disabling file IO profiling
2025-08-23 22:01:18,528 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.block.invalidate.limit : configured=1000, counted=60, effected=1000
2025-08-23 22:01:18,528 INFO org.apache.hadoop.hdfs.server.blockmanagement.DatanodeManager: dfs.namenode.datanode.registration.ip-hostname-check=true
2025-08-23 22:01:18,532 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.namenode.startup.delay.block.deletion.sec is set to 000:00:00:00.000
2025-08-23 22:01:18,532 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: The block deletion will start around 2025 八月 23 22:01:18
2025-08-23 22:01:18,535 INFO org.apache.hadoop.util.GSet: Computing capacity for map BlocksMap
2025-08-23 22:01:18,535 INFO org.apache.hadoop.util.GSet: VM type = 64-bit
2025-08-23 22:01:18,536 INFO org.apache.hadoop.util.GSet: 2.0% max memory 752 MB = 15.0 MB
2025-08-23 22:01:18,536 INFO org.apache.hadoop.util.GSet: capacity = 2^21 = 2097152 entries
2025-08-23 22:01:18,546 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: Storage policy satisfier is disabled
2025-08-23 22:01:18,546 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: dfs.block.access.token.enable = false
2025-08-23 22:01:18,564 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManagerSafeMode: dfs.namenode.safemode.threshold-pct = 0.999
2025-08-23 22:01:18,565 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManagerSafeMode: dfs.namenode.safemode.min.datanodes = 0
2025-08-23 22:01:18,565 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManagerSafeMode: dfs.namenode.safemode.extension = 30000
2025-08-23 22:01:18,565 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: defaultReplication = 3
2025-08-23 22:01:18,566 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplication = 512
2025-08-23 22:01:18,566 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: minReplication = 1
2025-08-23 22:01:18,566 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxReplicationStreams = 2
2025-08-23 22:01:18,566 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: redundancyRecheckInterval = 3000ms
2025-08-23 22:01:18,566 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: encryptDataTransfer = false
2025-08-23 22:01:18,566 INFO org.apache.hadoop.hdfs.server.blockmanagement.BlockManager: maxNumBlocksToLog = 1000
2025-08-23 22:01:18,637 INFO org.apache.hadoop.hdfs.server.namenode.FSDirectory: GLOBAL serial map: bits=29 maxEntries=536870911
2025-08-23 22:01:18,638 INFO org.apache.hadoop.hdfs.server.namenode.FSDirectory: USER serial map: bits=24 maxEntries=16777215
2025-08-23 22:01:18,638 INFO org.apache.hadoop.hdfs.server.namenode.FSDirectory: GROUP serial map: bits=24 maxEntries=16777215
2025-08-23 22:01:18,638 INFO org.apache.hadoop.hdfs.server.namenode.FSDirectory: XATTR serial map: bits=24 maxEntries=16777215
2025-08-23 22:01:18,658 INFO org.apache.hadoop.util.GSet: Computing capacity for map INodeMap
2025-08-23 22:01:18,658 INFO org.apache.hadoop.util.GSet: VM type = 64-bit
2025-08-23 22:01:18,658 INFO org.apache.hadoop.util.GSet: 1.0% max memory 752 MB = 7.5 MB
2025-08-23 22:01:18,658 INFO org.apache.hadoop.util.GSet: capacity = 2^20 = 1048576 entries
2025-08-23 22:01:18,659 INFO org.apache.hadoop.hdfs.server.namenode.FSDirectory: ACLs enabled? true
2025-08-23 22:01:18,659 INFO org.apache.hadoop.hdfs.server.namenode.FSDirectory: POSIX ACL inheritance enabled? true
2025-08-23 22:01:18,659 INFO org.apache.hadoop.hdfs.server.namenode.FSDirectory: XAttrs enabled? true
2025-08-23 22:01:18,659 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Caching file names occurring more than 10 times
2025-08-23 22:01:18,667 INFO org.apache.hadoop.hdfs.server.namenode.snapshot.SnapshotManager: Loaded config captureOpenFiles: false, skipCaptureAccessTimeOnlyChange: false, snapshotDiffAllowSnapRootDescendant: true, maxSnapshotLimit: 65536
2025-08-23 22:01:18,670 INFO org.apache.hadoop.hdfs.server.namenode.snapshot.SnapshotManager: SkipList is disabled
2025-08-23 22:01:18,676 INFO org.apache.hadoop.util.GSet: Computing capacity for map cachedBlocks
2025-08-23 22:01:18,676 INFO org.apache.hadoop.util.GSet: VM type = 64-bit
2025-08-23 22:01:18,676 INFO org.apache.hadoop.util.GSet: 0.25% max memory 752 MB = 1.9 MB
2025-08-23 22:01:18,676 INFO org.apache.hadoop.util.GSet: capacity = 2^18 = 262144 entries
2025-08-23 22:01:18,684 INFO org.apache.hadoop.hdfs.server.namenode.top.metrics.TopMetrics: NNTop conf: dfs.namenode.top.window.num.buckets = 10
2025-08-23 22:01:18,684 INFO org.apache.hadoop.hdfs.server.namenode.top.metrics.TopMetrics: NNTop conf: dfs.namenode.top.num.users = 10
2025-08-23 22:01:18,684 INFO org.apache.hadoop.hdfs.server.namenode.top.metrics.TopMetrics: NNTop conf: dfs.namenode.top.windows.minutes = 1,5,25
2025-08-23 22:01:18,687 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache on namenode is enabled
2025-08-23 22:01:18,687 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Retry cache will use 0.03 of total heap and retry cache entry expiry time is 600000 millis
2025-08-23 22:01:18,689 INFO org.apache.hadoop.util.GSet: Computing capacity for map NameNodeRetryCache
2025-08-23 22:01:18,689 INFO org.apache.hadoop.util.GSet: VM type = 64-bit
2025-08-23 22:01:18,689 INFO org.apache.hadoop.util.GSet: 0.029999999329447746% max memory 752 MB = 231.0 KB
2025-08-23 22:01:18,689 INFO org.apache.hadoop.util.GSet: capacity = 2^15 = 32768 entries
2025-08-23 22:01:18,721 INFO org.apache.hadoop.hdfs.server.common.Storage: Lock on /server/hadoop/data/nn/in_use.lock acquired by nodename 12582@master
2025-08-23 22:01:18,750 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Recovering unfinalized segments in /server/hadoop/data/nn/current
2025-08-23 22:01:18,800 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits file /server/hadoop/data/nn/current/edits_inprogress_0000000000000000003 -> /server/hadoop/data/nn/current/edits_0000000000000000003-0000000000000000003
2025-08-23 22:01:18,839 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Planning to load image: FSImageFile(file=/server/hadoop/data/nn/current/fsimage_0000000000000000000, cpktTxId=0000000000000000000)
2025-08-23 22:01:18,940 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Loading 1 INodes.
2025-08-23 22:01:18,948 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Successfully loaded 1 inodes
2025-08-23 22:01:18,960 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatPBINode: Completed update blocks map and name cache, total waiting duration 0ms.
2025-08-23 22:01:18,968 INFO org.apache.hadoop.hdfs.server.namenode.FSImageFormatProtobuf: Loaded FSImage in 0 seconds.
2025-08-23 22:01:18,969 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded image for txid 0 from /server/hadoop/data/nn/current/fsimage_0000000000000000000
2025-08-23 22:01:18,973 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Reading org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream@372b0d86 expecting start txid #1
2025-08-23 22:01:18,973 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Start loading edits file /server/hadoop/data/nn/current/edits_0000000000000000001-0000000000000000001 maxTxnsToRead = 9223372036854775807
2025-08-23 22:01:18,974 INFO org.apache.hadoop.hdfs.server.namenode.RedundantEditLogInputStream: Fast-forwarding stream '/server/hadoop/data/nn/current/edits_0000000000000000001-0000000000000000001' to transaction ID 1
2025-08-23 22:01:19,030 INFO org.apache.hadoop.hdfs.server.namenode.FSImage: Loaded 1 edits file(s) (the last named /server/hadoop/data/nn/current/edits_0000000000000000001-0000000000000000001) of total size 1048576.0, total edits 1.0, total load time 35.0 ms
2025-08-23 22:01:19,032 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Need to save fs image? false (staleImage=false, haEnabled=false, isRollingUpgrade=false)
2025-08-23 22:01:19,050 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Starting log segment at 4
2025-08-23 22:01:19,143 INFO org.apache.hadoop.hdfs.server.namenode.NameCache: initialized with 0 entries 0 lookups
2025-08-23 22:01:19,144 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Finished loading FSImage in 444 msecs
2025-08-23 22:01:19,375 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: RPC server is binding to hdfs:8020
2025-08-23 22:01:19,375 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: Enable NameNode state context:false
2025-08-23 22:01:19,384 INFO org.apache.hadoop.ipc.CallQueueManager: Using callQueue: class java.util.concurrent.LinkedBlockingQueue, queueCapacity: 10000, scheduler: class org.apache.hadoop.ipc.DefaultRpcScheduler, ipcBackoff: false.
2025-08-23 22:01:19,396 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Stopping services started for active state
2025-08-23 22:01:19,396 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Ending log segment 4, 4
2025-08-23 22:01:19,398 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: Number of transactions: 2 Total time for transactions(ms): 1 Number of transactions batched in Syncs: 3 Number of syncs: 3 SyncTimes(ms): 7
2025-08-23 22:01:19,399 INFO org.apache.hadoop.hdfs.server.namenode.FileJournalManager: Finalizing edits file /server/hadoop/data/nn/current/edits_inprogress_0000000000000000004 -> /server/hadoop/data/nn/current/edits_0000000000000000004-0000000000000000005
2025-08-23 22:01:19,401 INFO org.apache.hadoop.hdfs.server.namenode.FSEditLog: FSEditLogAsync was interrupted, exiting
2025-08-23 22:01:19,418 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Stopping services started for active state
2025-08-23 22:01:19,419 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Stopping services started for standby state
2025-08-23 22:01:19,426 INFO org.eclipse.jetty.server.handler.ContextHandler: Stopped o.e.j.w.WebAppContext@2d6c53fc{hdfs,/,null,STOPPED}{file:/server/hadoop/share/hadoop/hdfs/webapps/hdfs}
2025-08-23 22:01:19,431 INFO org.eclipse.jetty.server.AbstractConnector: Stopped ServerConnector@21d03963{HTTP/1.1, (http/1.1)}{0.0.0.0:9870}
2025-08-23 22:01:19,431 INFO org.eclipse.jetty.server.session: node0 Stopped scavenging
2025-08-23 22:01:19,431 INFO org.eclipse.jetty.server.handler.ContextHandler: Stopped o.e.j.s.ServletContextHandler@7486b455{static,/static,file:///server/hadoop/share/hadoop/hdfs/webapps/static/,STOPPED}
2025-08-23 22:01:19,431 INFO org.eclipse.jetty.server.handler.ContextHandler: Stopped o.e.j.s.ServletContextHandler@465232e9{logs,/logs,file:///server/hadoop/logs/,STOPPED}
2025-08-23 22:01:19,462 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping NameNode metrics system...
2025-08-23 22:01:19,463 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system stopped.
2025-08-23 22:01:19,464 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system shutdown complete.
2025-08-23 22:01:19,485 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: Failed to start namenode.
java.net.SocketException: Call From hdfs to null:0 failed on socket exception: java.net.SocketException: Unresolved address; For more details see: http://wiki.apache.org/hadoop/SocketException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:930)
    at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:888)
    at org.apache.hadoop.ipc.Server.bind(Server.java:676)
    at org.apache.hadoop.ipc.Server$Listener.<init>(Server.java:1284)
    at org.apache.hadoop.ipc.Server.<init>(Server.java:3211)
    at org.apache.hadoop.ipc.RPC$Server.<init>(RPC.java:1195)
    at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server.<init>(ProtobufRpcEngine2.java:485)
    at org.apache.hadoop.ipc.ProtobufRpcEngine2.getServer(ProtobufRpcEngine2.java:387)
    at org.apache.hadoop.ipc.RPC$Builder.build(RPC.java:986)
    at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.<init>(NameNodeRpcServer.java:474)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createRpcServer(NameNode.java:878)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:784)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:1033)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:1008)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1782)
    at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1847)
Caused by: java.net.SocketException: Unresolved address
    at sun.nio.ch.Net.translateToSocketException(Net.java:130)
    at sun.nio.ch.Net.translateException(Net.java:156)
    at sun.nio.ch.Net.translateException(Net.java:162)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:76)
    at org.apache.hadoop.ipc.Server.bind(Server.java:659)
    ... 13 more
Caused by: java.nio.channels.UnresolvedAddressException
    at sun.nio.ch.Net.checkAddress(Net.java:100)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:220)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    ... 14 more
2025-08-23 22:01:19,488 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1: java.net.SocketException: Call From hdfs to null:0 failed on socket exception: java.net.SocketException: Unresolved address; For more details see: http://wiki.apache.org/hadoop/SocketException
2025-08-23 22:01:19,497 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
/************************************************************
SHUTDOWN_MSG: Shutting down NameNode at master/192.168.88.8
************************************************************/
[root@master hadoop]#
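The failure reduces to one line: the NameNode RPC server tried to bind to `hdfs:8020`, but the hostname `hdfs` does not resolve, hence `java.net.SocketException: Unresolved address` and `Call From hdfs to null:0`. Since the web server and `fs.defaultFS` both use `master` (which resolves to 192.168.88.8 per the shutdown message), the stray name `hdfs` most likely comes from a NameNode address property in a site file. A minimal sketch of the fix, assuming the bad value sits in `hdfs-site.xml` (the property name below is the standard one, but search your config files for which one actually contains `hdfs`):

```xml
<!-- hdfs-site.xml (sketch): point the NameNode RPC address at a hostname
     that actually resolves on this machine. Deleting the property also
     works, because the NameNode then falls back to fs.defaultFS
     (hdfs://master:8020, as logged at startup). -->
<property>
  <name>dfs.namenode.rpc-address</name>
  <value>master:8020</value>
</property>
```

Alternatively, if the name `hdfs` is intentional, add a mapping such as `192.168.88.8 hdfs` to `/etc/hosts` on every node. Either way, rerun `start-dfs.sh` afterwards and confirm with `jps` that the NameNode process stays up.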
08-24
D:\Java\bin\java.exe "-javaagent:D:\Java\IntelliJ IDEA Community Edition 2023.1.7\lib\idea_rt.jar=58654:D:\Java\IntelliJ IDEA Community Edition 2023.1.7\bin" -Dfile.encoding=UTF-8 -classpath D:\hadoop_software\hdfscode\MRhbase\target\classes;D:\hadoop_software\Maven\repository\org\apache\hadoop\hadoop-client\3.1.3\hadoop-client-3.1.3.jar;D:\hadoop_software\Maven\repository\org\apache\hadoop\hadoop-common\3.1.3\hadoop-common-3.1.3.jar;D:\hadoop_software\Maven\repository\com\google\guava\guava\27.0-jre\guava-27.0-jre.jar;D:\hadoop_software\Maven\repository\com\google\guava\failureaccess\1.0\failureaccess-1.0.jar;D:\hadoop_software\Maven\repository\com\google\guava\listenablefuture\9999.0-empty-to-avoid-conflict-with-guava\listenablefuture-9999.0-empty-to-avoid-conflict-with-guava.jar;D:\hadoop_software\Maven\repository\org\checkerframework\checker-qual\2.5.2\checker-qual-2.5.2.jar;D:\hadoop_software\Maven\repository\com\google\j2objc\j2objc-annotations\1.1\j2objc-annotations-1.1.jar;D:\hadoop_software\Maven\repository\org\codehaus\mojo\animal-sniffer-annotations\1.17\animal-sniffer-annotations-1.17.jar;D:\hadoop_software\Maven\repository\commons-cli\commons-cli\1.2\commons-cli-1.2.jar;D:\hadoop_software\Maven\repository\org\apache\commons\commons-math3\3.1.1\commons-math3-3.1.1.jar;D:\hadoop_software\Maven\repository\org\apache\httpcomponents\httpclient\4.5.2\httpclient-4.5.2.jar;D:\hadoop_software\Maven\repository\org\apache\httpcomponents\httpcore\4.4.4\httpcore-4.4.4.jar;D:\hadoop_software\Maven\repository\commons-net\commons-net\3.6\commons-net-3.6.jar;D:\hadoop_software\Maven\repository\commons-collections\commons-collections\3.2.2\commons-collections-3.2.2.jar;D:\hadoop_software\Maven\repository\org\eclipse\jetty\jetty-servlet\9.3.24.v20180605\jetty-servlet-9.3.24.v20180605.jar;D:\hadoop_software\Maven\repository\org\eclipse\jetty\jetty-security\9.3.24.v20180605\jetty-security-9.3.24.v20180605.jar;D:\hadoop_software\Maven\repository\org\eclipse\jetty\jetty-webap
p\9.3.24.v20180605\jetty-webapp-9.3.24.v20180605.jar;D:\hadoop_software\Maven\repository\org\eclipse\jetty\jetty-xml\9.3.24.v20180605\jetty-xml-9.3.24.v20180605.jar;D:\hadoop_software\Maven\repository\javax\servlet\jsp\jsp-api\2.1\jsp-api-2.1.jar;D:\hadoop_software\Maven\repository\com\sun\jersey\jersey-servlet\1.19\jersey-servlet-1.19.jar;D:\hadoop_software\Maven\repository\commons-logging\commons-logging\1.1.3\commons-logging-1.1.3.jar;D:\hadoop_software\Maven\repository\commons-lang\commons-lang\2.6\commons-lang-2.6.jar;D:\hadoop_software\Maven\repository\commons-beanutils\commons-beanutils\1.9.3\commons-beanutils-1.9.3.jar;D:\hadoop_software\Maven\repository\org\apache\commons\commons-configuration2\2.1.1\commons-configuration2-2.1.1.jar;D:\hadoop_software\Maven\repository\org\apache\avro\avro\1.7.7\avro-1.7.7.jar;D:\hadoop_software\Maven\repository\org\codehaus\jackson\jackson-core-asl\1.9.13\jackson-core-asl-1.9.13.jar;D:\hadoop_software\Maven\repository\org\codehaus\jackson\jackson-mapper-asl\1.9.13\jackson-mapper-asl-1.9.13.jar;D:\hadoop_software\Maven\repository\com\thoughtworks\paranamer\paranamer\2.3\paranamer-2.3.jar;D:\hadoop_software\Maven\repository\org\xerial\snappy\snappy-java\1.0.5\snappy-java-1.0.5.jar;D:\hadoop_software\Maven\repository\com\google\re2j\re2j\1.1\re2j-1.1.jar;D:\hadoop_software\Maven\repository\com\google\code\gson\gson\2.2.4\gson-2.2.4.jar;D:\hadoop_software\Maven\repository\org\apache\curator\curator-client\2.13.0\curator-client-2.13.0.jar;D:\hadoop_software\Maven\repository\org\apache\curator\curator-recipes\2.13.0\curator-recipes-2.13.0.jar;D:\hadoop_software\Maven\repository\com\google\code\findbugs\jsr305\3.0.0\jsr305-3.0.0.jar;D:\hadoop_software\Maven\repository\org\apache\commons\commons-compress\1.18\commons-compress-1.18.jar;D:\hadoop_software\Maven\repository\org\apache\kerby\kerb-simplekdc\1.0.1\kerb-simplekdc-1.0.1.jar;D:\hadoop_software\Maven\repository\org\apache\kerby\kerb-client\1.0.1\kerb-client-1.0.1.jar;D:\hadoo
p_software\Maven\repository\org\apache\kerby\kerby-config\1.0.1\kerby-config-1.0.1.jar;D:\hadoop_software\Maven\repository\org\apache\kerby\kerb-core\1.0.1\kerb-core-1.0.1.jar;D:\hadoop_software\Maven\repository\org\apache\kerby\kerby-pkix\1.0.1\kerby-pkix-1.0.1.jar;D:\hadoop_software\Maven\repository\org\apache\kerby\kerby-asn1\1.0.1\kerby-asn1-1.0.1.jar;D:\hadoop_software\Maven\repository\org\apache\kerby\kerby-util\1.0.1\kerby-util-1.0.1.jar;D:\hadoop_software\Maven\repository\org\apache\kerby\kerb-common\1.0.1\kerb-common-1.0.1.jar;D:\hadoop_software\Maven\repository\org\apache\kerby\kerb-crypto\1.0.1\kerb-crypto-1.0.1.jar;D:\hadoop_software\Maven\repository\org\apache\kerby\kerb-util\1.0.1\kerb-util-1.0.1.jar;D:\hadoop_software\Maven\repository\org\apache\kerby\token-provider\1.0.1\token-provider-1.0.1.jar;D:\hadoop_software\Maven\repository\org\apache\kerby\kerb-admin\1.0.1\kerb-admin-1.0.1.jar;D:\hadoop_software\Maven\repository\org\apache\kerby\kerb-server\1.0.1\kerb-server-1.0.1.jar;D:\hadoop_software\Maven\repository\org\apache\kerby\kerb-identity\1.0.1\kerb-identity-1.0.1.jar;D:\hadoop_software\Maven\repository\org\apache\kerby\kerby-xdr\1.0.1\kerby-xdr-1.0.1.jar;D:\hadoop_software\Maven\repository\com\fasterxml\jackson\core\jackson-databind\2.7.8\jackson-databind-2.7.8.jar;D:\hadoop_software\Maven\repository\com\fasterxml\jackson\core\jackson-core\2.7.8\jackson-core-2.7.8.jar;D:\hadoop_software\Maven\repository\org\codehaus\woodstox\stax2-api\3.1.4\stax2-api-3.1.4.jar;D:\hadoop_software\Maven\repository\com\fasterxml\woodstox\woodstox-core\5.0.3\woodstox-core-5.0.3.jar;D:\hadoop_software\Maven\repository\org\apache\hadoop\hadoop-hdfs-client\3.1.3\hadoop-hdfs-client-3.1.3.jar;D:\hadoop_software\Maven\repository\com\squareup\okhttp\okhttp\2.7.5\okhttp-2.7.5.jar;D:\hadoop_software\Maven\repository\com\squareup\okio\okio\1.6.0\okio-1.6.0.jar;D:\hadoop_software\Maven\repository\com\fasterxml\jackson\core\jackson-annotations\2.7.8\jackson-annotations-2.7.8.jar
;D:\hadoop_software\Maven\repository\org\apache\hadoop\hadoop-yarn-api\3.1.3\hadoop-yarn-api-3.1.3.jar;D:\hadoop_software\Maven\repository\javax\xml\bind\jaxb-api\2.2.11\jaxb-api-2.2.11.jar;D:\hadoop_software\Maven\repository\org\apache\hadoop\hadoop-yarn-client\3.1.3\hadoop-yarn-client-3.1.3.jar;D:\hadoop_software\Maven\repository\org\apache\hadoop\hadoop-mapreduce-client-core\3.1.3\hadoop-mapreduce-client-core-3.1.3.jar;D:\hadoop_software\Maven\repository\org\apache\hadoop\hadoop-yarn-common\3.1.3\hadoop-yarn-common-3.1.3.jar;D:\hadoop_software\Maven\repository\org\eclipse\jetty\jetty-util\9.3.24.v20180605\jetty-util-9.3.24.v20180605.jar;D:\hadoop_software\Maven\repository\com\sun\jersey\jersey-core\1.19\jersey-core-1.19.jar;D:\hadoop_software\Maven\repository\javax\ws\rs\jsr311-api\1.1.1\jsr311-api-1.1.1.jar;D:\hadoop_software\Maven\repository\com\sun\jersey\jersey-client\1.19\jersey-client-1.19.jar;D:\hadoop_software\Maven\repository\com\fasterxml\jackson\module\jackson-module-jaxb-annotations\2.7.8\jackson-module-jaxb-annotations-2.7.8.jar;D:\hadoop_software\Maven\repository\com\fasterxml\jackson\jaxrs\jackson-jaxrs-json-provider\2.7.8\jackson-jaxrs-json-provider-2.7.8.jar;D:\hadoop_software\Maven\repository\com\fasterxml\jackson\jaxrs\jackson-jaxrs-base\2.7.8\jackson-jaxrs-base-2.7.8.jar;D:\hadoop_software\Maven\repository\org\apache\hadoop\hadoop-mapreduce-client-jobclient\3.1.3\hadoop-mapreduce-client-jobclient-3.1.3.jar;D:\hadoop_software\Maven\repository\org\apache\hadoop\hadoop-mapreduce-client-common\3.1.3\hadoop-mapreduce-client-common-3.1.3.jar;D:\hadoop_software\Maven\repository\org\apache\hadoop\hadoop-annotations\3.1.3\hadoop-annotations-3.1.3.jar;D:\hadoop_software\Maven\repository\junit\junit\4.12\junit-4.12.jar;D:\hadoop_software\Maven\repository\org\hamcrest\hamcrest-core\1.3\hamcrest-core-1.3.jar;D:\hadoop_software\Maven\repository\org\apache\hbase\hbase-client\2.4.11\hbase-client-2.4.11.jar;D:\hadoop_software\Maven\repository\org\apache\hbase\
thirdparty\hbase-shaded-protobuf\3.5.1\hbase-shaded-protobuf-3.5.1.jar;D:\hadoop_software\Maven\repository\org\apache\hbase\hbase-hadoop-compat\2.4.11\hbase-hadoop-compat-2.4.11.jar;D:\hadoop_software\Maven\repository\org\apache\hbase\hbase-hadoop2-compat\2.4.11\hbase-hadoop2-compat-2.4.11.jar;D:\hadoop_software\Maven\repository\javax\activation\javax.activation-api\1.2.0\javax.activation-api-1.2.0.jar;D:\hadoop_software\Maven\repository\org\apache\hbase\hbase-protocol-shaded\2.4.11\hbase-protocol-shaded-2.4.11.jar;D:\hadoop_software\Maven\repository\org\apache\hbase\hbase-protocol\2.4.11\hbase-protocol-2.4.11.jar;D:\hadoop_software\Maven\repository\javax\annotation\javax.annotation-api\1.2\javax.annotation-api-1.2.jar;D:\hadoop_software\Maven\repository\commons-codec\commons-codec\1.13\commons-codec-1.13.jar;D:\hadoop_software\Maven\repository\commons-io\commons-io\2.11.0\commons-io-2.11.0.jar;D:\hadoop_software\Maven\repository\org\apache\commons\commons-lang3\3.9\commons-lang3-3.9.jar;D:\hadoop_software\Maven\repository\org\slf4j\slf4j-api\1.7.33\slf4j-api-1.7.33.jar;D:\hadoop_software\Maven\repository\org\apache\hbase\thirdparty\hbase-shaded-miscellaneous\3.5.1\hbase-shaded-miscellaneous-3.5.1.jar;D:\hadoop_software\Maven\repository\com\google\errorprone\error_prone_annotations\2.7.1\error_prone_annotations-2.7.1.jar;D:\hadoop_software\Maven\repository\com\google\protobuf\protobuf-java\2.5.0\protobuf-java-2.5.0.jar;D:\hadoop_software\Maven\repository\org\apache\hbase\thirdparty\hbase-shaded-netty\3.5.1\hbase-shaded-netty-3.5.1.jar;D:\hadoop_software\Maven\repository\org\apache\zookeeper\zookeeper\3.5.7\zookeeper-3.5.7.jar;D:\hadoop_software\Maven\repository\org\apache\zookeeper\zookeeper-jute\3.5.7\zookeeper-jute-3.5.7.jar;D:\hadoop_software\Maven\repository\io\netty\netty-handler\4.1.45.Final\netty-handler-4.1.45.Final.jar;D:\hadoop_software\Maven\repository\io\netty\netty-common\4.1.45.Final\netty-common-4.1.45.Final.jar;D:\hadoop_software\Maven\repository\io\
The command was launched with a long Maven-repository classpath (truncated at the start of this excerpt) that mixes Hadoop 2.10.0 artifacts (`hadoop-auth`, `hadoop-hdfs`, `hadoop-distcp`) with HBase 2.4.11 artifacts (`hbase-common`, `hbase-server`, `hbase-mapreduce`, and friends), plus netty 4.1.45.Final and other transitive jars. Running `com.bigdata.hbase.FruitDriver` with this classpath fails with:

```
Exception in thread "main" java.lang.NoSuchMethodError: 'void org.apache.hadoop.security.HadoopKerberosName.setRuleMechanism(java.lang.String)'
    at org.apache.hadoop.security.HadoopKerberosName.setConfiguration(HadoopKerberosName.java:84)
    at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:318)
    at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:303)
    at org.apache.hadoop.security.UserGroupInformation.doSubjectLogin(UserGroupInformation.java:1827)
    at org.apache.hadoop.security.UserGroupInformation.createLoginUser(UserGroupInformation.java:709)
    at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:659)
    at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:570)
    at org.apache.hadoop.mapreduce.task.JobContextImpl.<init>(JobContextImpl.java:72)
    at org.apache.hadoop.mapreduce.Job.<init>(Job.java:150)
    at org.apache.hadoop.mapreduce.Job.getInstance(Job.java:193)
    at com.bigdata.hbase.FruitDriver.run(FruitDriver.java:19)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
    at com.bigdata.hbase.FruitDriver.main(FruitDriver.java:43)
```

A `NoSuchMethodError` like this usually means that `org.apache.hadoop` classes from two different Hadoop releases ended up on the same classpath, so one class calls a method that the other version does not provide.
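One way to rule out mixed Hadoop versions is to pin every `org.apache.hadoop` artifact to a single release in the project's `pom.xml`. This is a minimal sketch, not the project's actual build file; the `hadoop.version` value and the artifact list are assumptions — use whichever release your cluster actually runs, and add any other Hadoop artifacts your build pulls in:

```xml
<!-- Sketch: pin all Hadoop artifacts to one release.
     The 2.10.0 value here is an assumption taken from the classpath above. -->
<properties>
  <hadoop.version>2.10.0</hadoop.version>
</properties>

<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-auth</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-hdfs</artifactId>
      <version>${hadoop.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Afterwards, `mvn dependency:tree -Dincludes=org.apache.hadoop` shows which Hadoop versions Maven actually resolves, which makes any remaining mismatch easy to spot.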