HDFS and Hive partitions


[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -cat /user/hive/warehouse/hive2_db1.db/stuin
cat: `/user/hive/warehouse/hive2_db1.db/stuin': Is a directory
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -rm r /user/hive/warehouse/hive2_db1.db/stuin
rm: `r': No such file or directory
rm: `/user/hive/warehouse/hive2_db1.db/stuin': Is a directory
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -rm /user/hive/warehouse/hive2_db1.db/stuout
rm: `/user/hive/warehouse/hive2_db1.db/stuout': Is a directory
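
Note: the first delete fails because -rm was given "r" instead of "-r", so "r" is treated as a path, and plain -rm then refuses the remaining paths because they are directories. Removing a directory tree needs the recursive flag (optionally -skipTrash to bypass the trash), e.g.:

hdfs dfs -rm -r /user/hive/warehouse/hive2_db1.db/stuin
hdfs dfs -rm -r -skipTrash /user/hive/warehouse/hive2_db1.db/stuout
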
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs 
Usage: hadoop fs [generic options]
[-appendToFile <localsrc> ... <dst>]
[-cat [-ignoreCrc] <src> ...]
[-checksum <src> ...]
[-chgrp [-R] GROUP PATH...]
[-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
[-chown [-R] [OWNER][:[GROUP]] PATH...]
[-copyFromLocal [-f] [-p] <localsrc> ... <dst>]
[-copyToLocal [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
[-count [-q] <path> ...]
[-cp [-f] [-p] <src> ... <dst>]
[-createSnapshot <snapshotDir> [<snapshotName>]]
[-deleteSnapshot <snapshotDir> <snapshotName>]
[-df [-h] [<path> ...]]
[-du [-s] [-h] <path> ...]
[-expunge]
[-get [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
[-getfacl [-R] <path>]
[-getmerge [-nl] <src> <localdst>]
[-help [cmd ...]]
[-ls [-d] [-h] [-R] [<path> ...]]
[-mkdir [-p] <path> ...]
[-moveFromLocal <localsrc> ... <dst>]
[-moveToLocal <src> <localdst>]
[-mv <src> ... <dst>]
[-put [-f] [-p] <localsrc> ... <dst>]
[-renameSnapshot <snapshotDir> <oldName> <newName>]
[-rm [-f] [-r|-R] [-skipTrash] <src> ...]
[-rmdir [--ignore-fail-on-non-empty] <dir> ...]
[-setfacl [-R] [{-b|-k} {-m|-x <acl_spec>} <path>]|[--set <acl_spec> <path>]]
[-setrep [-R] [-w] <rep> <path> ...]
[-stat [format] <path> ...]
[-tail [-f] <file>]
[-test -[defsz] <path>]
[-text [-ignoreCrc] <src> ...]
[-touchz <path> ...]
[-usage [cmd ...]]


Generic options supported are
-conf <configuration file>     specify an application configuration file
-D <property=value>            use value for given property
-fs <local|namenode:port>      specify a namenode
-jt <local|jobtracker:port>    specify a job tracker
-files <comma separated list of files>    specify comma separated files to be copied to the map reduce cluster
-libjars <comma separated list of jars>    specify comma separated jar files to include in the classpath.
-archives <comma separated list of archives>    specify comma separated archives to be unarchived on the compute machines.


The general command line syntax is
bin/hadoop command [genericOptions] [commandOptions]


[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -rmdir /user/hive/warehouse/hive2_db1.db/stuout
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -ls /
Found 7 items
drwxr-xr-x   - wangshumin supergroup          0 2018-02-09 07:32 /data
drwxr-xr-x   - wangshumin supergroup          0 2018-02-09 07:41 /dataload_balance
drwxr-xr-x   - wangshumin supergroup          0 2018-02-09 07:18 /flumedata2
drwxr-xr-x   - wangshumin supergroup          0 2018-02-26 15:35 /hbase
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 13:16 /hivedata
drwx-wx-wx   - wangshumin supergroup          0 2018-03-12 13:04 /tmp
drwxr-xr-x   - wangshumin supergroup          0 2018-02-09 06:50 /user
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -rmdir /hivedata
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -ls /
Found 6 items
drwxr-xr-x   - wangshumin supergroup          0 2018-02-09 07:32 /data
drwxr-xr-x   - wangshumin supergroup          0 2018-02-09 07:41 /dataload_balance
drwxr-xr-x   - wangshumin supergroup          0 2018-02-09 07:18 /flumedata2
drwxr-xr-x   - wangshumin supergroup          0 2018-02-26 15:35 /hbase
drwx-wx-wx   - wangshumin supergroup          0 2018-03-12 13:04 /tmp
drwxr-xr-x   - wangshumin supergroup          0 2018-02-09 06:50 /user
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -ls /user/hive/warehouse/hive2_db1.db/
Found 4 items
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 11:31 /user/hive/warehouse/hive2_db1.db/stu1
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 11:48 /user/hive/warehouse/hive2_db1.db/stu2
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 11:48 /user/hive/warehouse/hive2_db1.db/stu3
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 13:13 /user/hive/warehouse/hive2_db1.db/stuin
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -ls /user/hive/warehouse/hive2_db1.db/
Found 4 items
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 11:31 /user/hive/warehouse/hive2_db1.db/stu1
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 11:48 /user/hive/warehouse/hive2_db1.db/stu2
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 11:48 /user/hive/warehouse/hive2_db1.db/stu3
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 13:13 /user/hive/warehouse/hive2_db1.db/stuin
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -put  stu3 /user/hive/warehouse/hive2_db1.db/stuout
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -ls /user/hive/warehouse/hive2_db1.db/stuout
-rw-r--r--   3 wangshumin supergroup         53 2018-03-12 13:22 /user/hive/warehouse/hive2_db1.db/stuout
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -cat  /user/hive/warehouse/hive2_db1.db/stuout
1 , zhangshan , 20
2 , wangwu  , 19
3 , xiaolu  , 26
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -put  stu3 /user/hive/hivedata
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -cat   /user/hive/hivedata/stu3
1 , zhangshan , 20
2 , wangwu  , 19
3 , xiaolu  , 26
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -rm   /user/hive/hivedata/stu3
18/03/12 13:31:51 INFO fs.TrashPolicyDefault: Namenode trash configuration: Deletion interval = 0 minutes, Emptier interval = 0 minutes.
Deleted /user/hive/hivedata/stu3
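
Note: the INFO line shows the trash deletion interval is 0 minutes, so the file is removed permanently instead of being moved to the user's .Trash directory. Trash can be enabled (a sketch, assuming core-site.xml is editable) by giving fs.trash.interval a retention time in minutes:

<property>
  <name>fs.trash.interval</name>
  <value>1440</value> <!-- keep deleted files in .Trash for 24 hours -->
</property>
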
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -ls   /user/hive/hivedata
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -put  stu3 /user/hive/hivedata
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -cat   /user/hive/hivedata/stu3
1 , zhangshan , 20
2 , wangwu  , 19
3 , xiaolu  , 26
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -cat   /user/hive/hivedata/stu3
1 , zhangshan , 20
2 , wangwu  , 19
3 , xiaolu  , 26
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -ls   /user/hive/warehouse/hive2_db1.db
Found 5 items
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 11:31 /user/hive/warehouse/hive2_db1.db/stu1
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 11:48 /user/hive/warehouse/hive2_db1.db/stu2
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 11:48 /user/hive/warehouse/hive2_db1.db/stu3
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 13:13 /user/hive/warehouse/hive2_db1.db/stuin
-rw-r--r--   3 wangshumin supergroup         53 2018-03-12 13:22 /user/hive/warehouse/hive2_db1.db/stuout
[wangshumin@centoshostnameKL2 ~]$ vim hsql
[wangshumin@centoshostnameKL2 ~]$ cat hsql 
 create external table  stuout( id int , name  String , age  int ) 
 row format delimited 
 fields terminated by ',' 
 location "/user/hive/hivedata"
 ;
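
The script can be run non-interactively with hive -f (a sketch, assuming the hive CLI is on the PATH and that "use hive2_db1;" is either added to the file or the default database is intended):

hive -f /home/wangshumin/hsql

Because the table is external and its location /user/hive/hivedata already holds stu3, a select * from stuout will immediately list whatever rows it can parse from that file.
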
[wangshumin@centoshostnameKL2 ~]$ vim  stu2
[wangshumin@centoshostnameKL2 ~]$ pwd
/home/wangshumin
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -ls  /user/hive/warehouse/hive2_db1.db
Found 6 items
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 11:31 /user/hive/warehouse/hive2_db1.db/stu1
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 13:59 /user/hive/warehouse/hive2_db1.db/stu12
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 11:48 /user/hive/warehouse/hive2_db1.db/stu2
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 11:48 /user/hive/warehouse/hive2_db1.db/stu3
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 13:13 /user/hive/warehouse/hive2_db1.db/stuin
-rw-r--r--   3 wangshumin supergroup         53 2018-03-12 13:22 /user/hive/warehouse/hive2_db1.db/stuout
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -cat  /user/hive/warehouse/hive2_db1.db/stu12
cat: `/user/hive/warehouse/hive2_db1.db/stu12': Is a directory
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -ls  /user/hive/warehouse/hive2_db1.db/stu12
Found 3 items
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 13:56 /user/hive/warehouse/hive2_db1.db/stu12/coutr=10000
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 13:58 /user/hive/warehouse/hive2_db1.db/stu12/coutr=10001
drwxr-xr-x   - wangshumin supergroup          0 2018-03-12 13:59 /user/hive/warehouse/hive2_db1.db/stu12/coutr=10002
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -ls  /user/hive/warehouse/hive2_db1.db/stu12/coutr=10000
Found 2 items
-rwxr-xr-x   3 wangshumin supergroup         69 2018-03-12 13:55 /user/hive/warehouse/hive2_db1.db/stu12/coutr=10000/stu2
-rwxr-xr-x   3 wangshumin supergroup         69 2018-03-12 13:56 /user/hive/warehouse/hive2_db1.db/stu12/coutr=10000/stu2_copy_1
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -cat  /user/hive/warehouse/hive2_db1.db/stu12/coutr=10000/stu2
1 ,zhangshan ,20 ,1000
2 ,wangwu  ,  19 ,1100
3 ,xiaolu  ,  26 ,1200
[wangshumin@centoshostnameKL2 ~]$ hdfs dfs -cat  /user/hive/warehouse/hive2_db1.db/stu12/coutr=10000/stu2_copy_1
1 ,zhangshan ,20 ,1000
2 ,wangwu  ,  19 ,1100
3 ,xiaolu  ,  26 ,1200
[wangshumin@centoshostnameKL2 ~]$ 
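
Each partition of stu12 is stored as a subdirectory named coutr=<value> under the table directory, and every load adds a data file inside it. The DDL for stu12 is not shown in this session; a table that produces this layout would look roughly like the following (a sketch, with column names inferred from the select output below):

create table stu12 ( id int , name String , age int )
partitioned by ( coutr String )
row format delimited
fields terminated by ','
;
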

hive> load data local inpath "/home/wangshumin/stu2"  into  table stu12 partition(coutr="10000");

Loading data to table hive2_db1.stu12 partition (coutr=10000)
Partition hive2_db1.stu12{coutr=10000} stats: [numFiles=1, numRows=0, totalSize=69, rawDataSize=0]
OK
Time taken: 0.716 seconds
hive> load data local inpath "/home/wangshumin/stu2"  into  table stu12 partition(coutr="10000");
Loading data to table hive2_db1.stu12 partition (coutr=10000)
Partition hive2_db1.stu12{coutr=10000} stats: [numFiles=2, numRows=0, totalSize=138, rawDataSize=0]
OK
Time taken: 0.527 seconds
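
Loading the same local file into the same partition twice does not overwrite it: Hive keeps the first file and stores the second as stu2_copy_1, which is why numFiles grew from 1 to 2 and the coutr=10000 directory listed above contains two files. To replace a partition's contents instead, add overwrite:

load data local inpath "/home/wangshumin/stu2" overwrite into table stu12 partition(coutr="10000");
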
hive> select  * from  stu12;
OK
NULL zhangshan NULL 10000
NULL wangwu   NULL 10000
NULL xiaolu   NULL 10000
NULL zhangshan NULL 10000
NULL wangwu   NULL 10000
NULL xiaolu   NULL 10000
Time taken: 0.089 seconds, Fetched: 6 row(s)
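
The id and age columns come back as NULL because the table splits each row on ',' while stu2 has spaces around the commas ("1 ,zhangshan ,20 ,1000"), so the numeric fields contain stray blanks that cannot be parsed as int; the fourth field (1000) has no matching column and is dropped, and coutr is taken from the partition value rather than from the file. One way to confirm this (a sketch, writing a cleaned copy under an assumed name stu2_clean) is to strip the blanks and reload:

sed 's/ *, */,/g' /home/wangshumin/stu2 > /home/wangshumin/stu2_clean

load data local inpath "/home/wangshumin/stu2_clean" overwrite into table stu12 partition(coutr="10000");
select * from stu12 where coutr="10000";
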
hive> 
    > 
    > 
    > 
    > load data local inpath "/home/wangshumin/stu2"  into  table stu12 partition(coutr="10001");
Loading data to table hive2_db1.stu12 partition (coutr=10001)
Partition hive2_db1.stu12{coutr=10001} stats: [numFiles=1, numRows=0, totalSize=69, rawDataSize=0]
OK
Time taken: 0.51 seconds
hive> select  * from  stu12;
OK
NULL zhangshan NULL 10000
NULL wangwu   NULL 10000
NULL xiaolu   NULL 10000
NULL zhangshan NULL 10000
NULL wangwu   NULL 10000
NULL xiaolu   NULL 10000
NULL zhangshan NULL 10001
NULL wangwu   NULL 10001
NULL xiaolu   NULL 10001
Time taken: 0.131 seconds, Fetched: 9 row(s)
hive> load data local inpath "/home/wangshumin/stu2"  into  table stu12 partition(coutr="10002");
Loading data to table hive2_db1.stu12 partition (coutr=10002)
Partition hive2_db1.stu12{coutr=10002} stats: [numFiles=1, numRows=0, totalSize=69, rawDataSize=0]
OK
Time taken: 0.397 seconds
hive> select  * from  stu12;
OK
NULL zhangshan NULL 10000
NULL wangwu   NULL 10000
NULL xiaolu   NULL 10000
NULL zhangshan NULL 10000
NULL wangwu   NULL 10000
NULL xiaolu   NULL 10000
NULL zhangshan NULL 10001
NULL wangwu   NULL 10001
NULL xiaolu   NULL 10001
NULL zhangshan NULL 10002
NULL wangwu   NULL 10002
NULL xiaolu   NULL 10002
Time taken: 0.084 seconds, Fetched: 12 row(s)
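
Since coutr is a partition column, filtering on it lets Hive read only the matching coutr=<value> directory instead of scanning every partition, e.g.:

show partitions stu12;
select * from stu12 where coutr = "10001";
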
hive> 
    > 
    > 
    > 