Hadoop Notes - HDFS Quotas

This post shows how to set and manage name (file-count) quotas and space quotas in HDFS from the command line: setting a quota, clearing it, and inspecting a directory's usage, with example commands for each.


Setting a name quota (files + directories)

hdfs dfsadmin -setQuota <N> <directory>...<directory>

For example, to limit the directory to at most 1000 names (files plus directories):

hdfs dfsadmin -setQuota 1000 /p/work
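When scripting quota changes, it can help to validate the limit before handing it to the admin command. A minimal sketch; the helper name and the dry-run `echo` are my own convention, not part of the HDFS CLI:

```shell
#!/bin/bash
# Hypothetical helper: reject non-numeric or non-positive name quotas
# before they reach `hdfs dfsadmin -setQuota`.
set_name_quota() {
    local n="$1" dir="$2"
    case "$n" in
        ''|*[!0-9]*) echo "invalid quota: $n" >&2; return 1 ;;
    esac
    [ "$n" -gt 0 ] || { echo "quota must be positive: $n" >&2; return 1; }
    # Echo for a dry run; drop the echo to actually apply the quota.
    echo hdfs dfsadmin -setQuota "$n" "$dir"
}

set_name_quota 1000 /p/work
```

Run against a real cluster, dropping the `echo` applies the quota immediately, so the dry run is worth keeping while testing.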

Clearing a name quota

hdfs dfsadmin -clrQuota <directory>...<directory>

Setting a space quota

hdfs dfsadmin -setSpaceQuota <N> <directory>...<directory>

For example, to limit the directory to 9 TB of raw disk space (the space quota counts replicated bytes, not logical file size):

hdfs dfsadmin -setSpaceQuota 9T /p/work
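The 9T figure comes from multiplying the logical data size by the replication factor: with the default factor of 3, 3 TB of data occupies 9 TB of raw space, which is what the space quota measures. A quick sanity check of that arithmetic:

```shell
# Raw space quota = logical size (TB) x replication factor.
logical_tb=3
replication=3
quota="$(( logical_tb * replication ))T"
echo "$quota"   # 9T
```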

Clearing a space quota

hdfs dfsadmin -clrSpaceQuota <directory>...<directory>

Inspecting directory usage

hdfs dfs -count -q <directory>

Field descriptions:

QUOTA            - limit on the number of names (directories + files)
REM_QUOTA        - remaining name quota (names still available)
SPACE_QUOTA      - space limit in bytes (raw, replicated space)
REM_SPACE_QUOTA  - remaining bytes
DIR_COUNT        - number of directories
FILE_COUNT       - number of files
CONTENT_SIZE     - current content size in bytes
PATHNAME         - the HDFS path
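The output of hdfs dfs -count -q is whitespace-separated in the field order above, so it is easy to post-process with awk. A sketch against a made-up sample line; the numbers are illustrative, not from a real cluster:

```shell
# One sample output line in the field order QUOTA REM_QUOTA SPACE_QUOTA
# REM_SPACE_QUOTA DIR_COUNT FILE_COUNT CONTENT_SIZE PATHNAME (values invented).
sample='1000 900 9895604649984 9895604000000 5 95 216000 /p/work'

# Names already consumed = QUOTA - REM_QUOTA.
echo "$sample" | awk '{print $8, "uses", $1 - $2, "of", $1, "names"}'
# -> /p/work uses 100 of 1000 names
```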
#!/bin/bash
# Set a space quota for every directory listed in userquota.txt.
# Each line holds the HDFS directory in column 1 and the logical size
# in TB in column 4; the logical size is multiplied by 3 to account
# for 3x replication. (The original used `for`/`let` bashisms under
# #!/bin/sh and a line count that was always > 0; fixed here.)

export HADOOP_CONF_DIR=/home/hdfs/balancer/hadoop-conf

if [ ! -s userquota.txt ]; then
    echo "userquota.txt is empty or missing"
    exit 1
fi

while read -r dir _ _ logical _; do
    [ -z "$dir" ] && continue              # skip blank lines
    size="$(( logical * 3 ))T"             # raw quota = logical size x replication
    hdfs dfsadmin -setSpaceQuota "$size" "$dir"
done < userquota.txt
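To see what the script would do before touching the cluster, the same per-line loop can be run against a sample userquota.txt, echoing the commands instead of executing them. The sample paths, columns, and sizes below are invented for illustration:

```shell
# Build a sample quota file: <directory> <owner> <group> <logical-TB>.
cat > /tmp/userquota.txt <<'EOF'
/p/work  alice grp 3
/p/spark bob   grp 2
EOF

# Same logic as the script, but echoing the command instead of calling hdfs.
while read -r dir _ _ logical _; do
    [ -z "$dir" ] && continue
    echo "hdfs dfsadmin -setSpaceQuota $(( logical * 3 ))T $dir"
done < /tmp/userquota.txt
# -> hdfs dfsadmin -setSpaceQuota 9T /p/work
# -> hdfs dfsadmin -setSpaceQuota 6T /p/spark
```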

Reposted from: https://www.cnblogs.com/xinfang520/p/10442607.html
