A Summary of Hadoop HDFS Commands

 

1. List all directories and files in the root directory

hadoop fs -ls /

 

2. List all directories and files under /logs

hadoop fs -ls /logs

 

3. Recursively list all files under /user and its subdirectories (use with caution)

hadoop fs -ls -R /user

 

4. Create the /soft directory

hadoop fs -mkdir /soft

 

5. Create nested (multi-level) directories

hadoop fs -mkdir -p /apps/windows/2017/01/01

 

6. Upload the local wordcount.jar file to the /wordcount directory

hadoop fs -put wordcount.jar /wordcount

 

7. Download /words.txt to the local machine (the current directory by default)

hadoop fs -get /words.txt

 

 

8. Copy the /stu/students.txt file to the local machine

hadoop fs -copyToLocal /stu/students.txt

 

9. Copy the local word.txt file to the /wordcount/input/ directory

hadoop fs -copyFromLocal word.txt /wordcount/input

 

10. Move the local word.txt file to the /wordcount/input/ directory

hadoop fs -moveFromLocal word.txt /wordcount/input/

 

11. Copy /stu/students.txt to /stu/students.txt.bak

hadoop fs -cp /stu/students.txt /stu/students.txt.bak

 

12. Copy the /flume/tailout/ directory, including its subdirectories and files, to the /logs directory (created if it does not exist)

hadoop fs -cp /flume/tailout/ /logs

 

13. Rename /word.txt to /words.txt

hadoop fs -mv /word.txt /words.txt

 

14. Move /words.txt into the /wordcount/input/ directory

hadoop fs -mv /words.txt /wordcount/input/

 

15. Delete the /ws directory together with all its subdirectories and files (use with caution)

hadoop fs -rm -r /ws

 

16. Delete directories whose names start with "xbs-", including their contents

hadoop fs -rm -r /xbs-*
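One subtlety with the glob above: the pattern is expanded by HDFS itself, not by your local shell, as long as no local path happens to match it — with bash's default settings (`nullglob` off), an unmatched pattern is passed through literally. A minimal local demonstration:

```shell
# In a fresh empty directory no local file matches "xbs-*", so the shell
# passes the pattern through unchanged -- which is exactly what lets
# `hadoop fs -rm -r /xbs-*` expand the glob on the HDFS side instead.
tmp=$(mktemp -d)
cd "$tmp"
echo xbs-*    # prints the literal pattern: xbs-*
```

If a matching local file did exist in the working directory, the shell would substitute the local name first, so quoting the pattern is the safer habit.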

 

 

17. Delete the a.txt file under /wordcount/output2/

hadoop fs -rm /wordcount/output2/a.txt

 

18. Delete all files under the /wordcount/input/ directory

hadoop fs -rm /wordcount/input/*

 

19. Show disk usage of the HDFS cluster

hadoop fs -df -h

 

20. Display the contents of /word.txt

hadoop fs -cat /word.txt

 

21. Append the contents of the local name.txt file to /wordcount/input/words.txt

hadoop fs -appendToFile name.txt /wordcount/input/words.txt

 

22. Continuously watch (tail) the contents of /wordcount/input/words.txt

hadoop fs -tail -f /wordcount/input/words.txt

 

23. Show the total size of the /flume directory

hadoop fs -du -s -h /flume

 

24. Show the size of each subdirectory (or file) under /flume separately

hadoop fs -du -s -h /flume/*
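When the per-entry sizes feed a script, it is easier to drop `-h` so the first column comes back in plain bytes, which standard tools can sort or sum directly. The sample lines below are made up for illustration; on a real cluster this text would come from `hadoop fs -du /flume/*` itself:

```shell
# Hypothetical byte-count output of `hadoop fs -du /flume/*`
# (size in bytes, then path). Real output comes from the command.
du_out='128180224 /flume/tailout
4096 /flume/conf
52428800 /flume/archive'

# Sort entries largest-first by the numeric byte count in column 1.
printf '%s\n' "$du_out" | sort -k1,1nr
```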

 

25. Run a program from a jar file

# hadoop jar <jar to run> <main class> <input dir> <output dir>
hadoop jar wordcount.jar com.xuebusi.hadoop.mr.WordCountDriver /wordcount/input /wordcount/out
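The four parts of that invocation are easy to mix up. As a sketch (the helper name `build_mr_cmd` is mine, not part of Hadoop), a tiny function can assemble the command and echo it for review before you actually run it:

```shell
# Sketch: assemble the `hadoop jar` invocation from its four parts and
# print it for review; remove the printf indirection to run it directly.
build_mr_cmd() {
  # $1 = jar file, $2 = driver class, $3 = HDFS input dir, $4 = HDFS output dir
  printf 'hadoop jar %s %s %s %s\n' "$1" "$2" "$3" "$4"
}

build_mr_cmd wordcount.jar com.xuebusi.hadoop.mr.WordCountDriver \
  /wordcount/input /wordcount/out
```

Note that a MapReduce job refuses to start if its output directory already exists, so pick a fresh output path (or delete the old one with `-rm -r`) before each run.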

 

26. Check the HDFS cluster status

hdfs dfsadmin -report
[root@hadoop03 apps]# hdfs dfsadmin -report
Configured Capacity: 55737004032 (51.91 GB)
Present Capacity: 15066578944 (14.03 GB)
DFS Remaining: 14682021888 (13.67 GB)
DFS Used: 384557056 (366.74 MB)
DFS Used%: 2.55%
Under replicated blocks: 7
Blocks with corrupt replicas: 0
Missing blocks: 0

-------------------------------------------------
Live datanodes (3):

Name: 192.168.71.11:50010 (hadoop01)
Hostname: hadoop01
Decommission Status : Normal
Configured Capacity: 18579001344 (17.30 GB)
DFS Used: 128180224 (122.24 MB)
Non DFS Used: 16187543552 (15.08 GB)
DFS Remaining: 2263277568 (2.11 GB)
DFS Used%: 0.69%
DFS Remaining%: 12.18%
Configured Cache Capacity: 0 (0 B)
Cache Used: 0 (0 B)
Cache Remaining: 0 (0 B)
Cache Used%: 100.00%
Cache Remaining%: 0.00%
Xceivers: 1
Last contact: Mon Jan 09 11:17:43 PST 2017


Name: 192.168.71.13:50010 (hadoop03)
Hostname: hadoop03
Decommission Status : Normal
Configured Capacity: 18579001344 (17.30 GB)
DFS Used: 128196608 (122.26 MB)
Non DFS Used: 13623074816 (12.69 GB)
DFS Remaining: 4827729920 (4.50 GB)
DFS Used%: 0.69%
DFS Remaining%: 25.98%
Configured Cache Capacity: 0 (0 B)
Cache Used: 0 (0 B)
Cache Remaining: 0 (0 B)
Cache Used%: 100.00%
Cache Remaining%: 0.00%
Xceivers: 1
Last contact: Mon Jan 09 11:17:41 PST 2017


Name: 192.168.71.12:50010 (hadoop02)
Hostname: hadoop02
Decommission Status : Normal
Configured Capacity: 18579001344 (17.30 GB)
DFS Used: 128180224 (122.24 MB)
Non DFS Used: 10859806720 (10.11 GB)
DFS Remaining: 7591014400 (7.07 GB)
DFS Used%: 0.69%
DFS Remaining%: 40.86%
Configured Cache Capacity: 0 (0 B)
Cache Used: 0 (0 B)
Cache Remaining: 0 (0 B)
Cache Used%: 100.00%
Cache Remaining%: 0.00%
Xceivers: 1
Last contact: Mon Jan 09 11:17:42 PST 2017
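Report dumps like the one above are handy for quick monitoring: for instance, an awk one-liner can pull each datanode's hostname and remaining-capacity percentage out of the text. The snippet below runs against an inline excerpt of the report shown above; on a live cluster you would pipe `hdfs dfsadmin -report` in directly:

```shell
# Excerpt of the report above, saved inline for demonstration.
report='Hostname: hadoop01
DFS Remaining%: 12.18%
Hostname: hadoop03
DFS Remaining%: 25.98%
Hostname: hadoop02
DFS Remaining%: 40.86%'

# Remember each Hostname line, then emit "host remaining%" when the
# matching DFS Remaining% line arrives.
printf '%s\n' "$report" |
  awk -F': ' '/^Hostname/ {h=$2} /^DFS Remaining%/ {print h, $2}'
```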

 

27. View usage help for the hadoop fs command

[root@hadoop01 hadoop]# hadoop fs
Usage: hadoop fs [generic options]
        [-appendToFile <localsrc> ... <dst>]
        [-cat [-ignoreCrc] <src> ...]
        [-checksum <src> ...]
        [-chgrp [-R] GROUP PATH...]
        [-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
        [-chown [-R] [OWNER][:[GROUP]] PATH...]
        [-copyFromLocal [-f] [-p] [-l] <localsrc> ... <dst>]
        [-copyToLocal [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
        [-count [-q] [-h] <path> ...]
        [-cp [-f] [-p | -p[topax]] <src> ... <dst>]
        [-createSnapshot <snapshotDir> [<snapshotName>]]
        [-deleteSnapshot <snapshotDir> <snapshotName>]
        [-df [-h] [<path> ...]]
        [-du [-s] [-h] <path> ...]
        [-expunge]
        [-get [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
        [-getfacl [-R] <path>]
        [-getfattr [-R] {-n name | -d} [-e en] <path>]
        [-getmerge [-nl] <src> <localdst>]
        [-help [cmd ...]]
        [-ls [-d] [-h] [-R] [<path> ...]]
        [-mkdir [-p] <path> ...]
        [-moveFromLocal <localsrc> ... <dst>]
        [-moveToLocal <src> <localdst>]
        [-mv <src> ... <dst>]
        [-put [-f] [-p] [-l] <localsrc> ... <dst>]
        [-renameSnapshot <snapshotDir> <oldName> <newName>]
        [-rm [-f] [-r|-R] [-skipTrash] <src> ...]
        [-rmdir [--ignore-fail-on-non-empty] <dir> ...]
        [-setfacl [-R] [{-b|-k} {-m|-x <acl_spec>} <path>]|[--set <acl_spec> <path>]]
        [-setfattr {-n name [-v value] | -x name} <path>]
        [-setrep [-R] [-w] <rep> <path> ...]
        [-stat [format] <path> ...]
        [-tail [-f] <file>]
        [-test -[defsz] <path>]
        [-text [-ignoreCrc] <src> ...]
        [-touchz <path> ...]
        [-usage [cmd ...]]

 

Reposted from: https://www.cnblogs.com/jun1019/p/6263282.html
