Hive: using WITH ... AS, CASE WHEN, JOIN ON and GROUP BY together, a worked example (10)

I. Data source: place the data on the local Linux filesystem as static files

Data source
Salary breakdown table    sal.txt    (Hive table: tb_sal)
uid  jb (base pay)  jj (bonus)  tc (commission)  deptno
1,2000,3000,1500,1
2,5000,500,1000,2
3,1500,1000,3000,2
4,3000,6000,8000,3
5,1500,2000,1800,1
6,2500,1000,1900,1

department.txt    Department table    (Hive table: tb_department)
1,Sales
2,Tech
3,Admin

employee.txt    Employee table    (Hive table: tb_employee)
uid name gender age
1,zs,M,28
2,ww,F,36
3,zl,F,48
4,pp,M,44
5,wb,M,32
6,TQ,F,32

Place the three structured data files in the local directory /root/hive/companydata/
[root@linux03 companydata]# vi sal.txt             -- salary data
[root@linux03 companydata]# vi department.txt      -- department data
[root@linux03 companydata]# vi employee.txt        -- employee data
[root@linux03 companydata]# ll                     -- list the files
-rw-r--r--. 1 root root  19 Sep  3 01:13 department.txt
-rw-r--r--. 1 root root  79 Sep  3 01:12 employee.txt
-rw-r--r--. 1 root root 105 Sep  3 01:14 sal.txt
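The same files can also be generated non-interactively with here-documents instead of vi. A sketch; the target directory is ./companydata here for illustration, and would be /root/hive/companydata on the cluster host as used by the LOAD DATA statements below:

```shell
# Sketch: generate the three data files without an editor.
# Department names are the English translations (Sales/Tech/Admin).
DIR=./companydata
mkdir -p "$DIR"

cat > "$DIR/sal.txt" <<'EOF'
1,2000,3000,1500,1
2,5000,500,1000,2
3,1500,1000,3000,2
4,3000,6000,8000,3
5,1500,2000,1800,1
6,2500,1000,1900,1
EOF

cat > "$DIR/department.txt" <<'EOF'
1,Sales
2,Tech
3,Admin
EOF

cat > "$DIR/employee.txt" <<'EOF'
1,zs,M,28
2,ww,F,36
3,zl,F,48
4,pp,M,44
5,wb,M,32
6,TQ,F,32
EOF

wc -l "$DIR"/*.txt   # quick sanity check on line counts
```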

II. Create the tables, load the data into them, then verify the loaded data

0: jdbc:hive2://linux03:10000>
Drop tb_sal, recreate it, and load sal.txt into the table:
drop table tb_sal;
create table tb_sal(
uid int,
jb int,
jj int,
tc int,
deptno int)
row format delimited fields terminated by ",";
load data local inpath "/root/hive/companydata/sal.txt" into table tb_sal;

Drop tb_employee, recreate it, and load employee.txt into the table:
drop table tb_employee;
create table tb_employee(
uid int,
name string,
gender string,
age int)
row format delimited fields terminated by ",";
load data local inpath "/root/hive/companydata/employee.txt" into table tb_employee;

Drop tb_department, recreate it, and load department.txt into the table:
drop table tb_department;
create table tb_department(
deptno int,
bname string)
row format delimited fields terminated by ",";
load data local inpath "/root/hive/companydata/department.txt" into table tb_department;
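Note that re-running `load data ... into table` appends rows, so repeating a load duplicates the data. To replace a table's contents on a reload instead, add `overwrite`:

```sql
-- Replaces existing rows instead of appending on a re-run
load data local inpath "/root/hive/companydata/sal.txt" overwrite into table tb_sal;
```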

Verify the loaded data:
0: jdbc:hive2://linux03:10000> select * from tb_sal;
+-------------+------------+------------+------------+----------------+
| tb_sal.uid  | tb_sal.jb  | tb_sal.jj  | tb_sal.tc  | tb_sal.deptno  |
+-------------+------------+------------+------------+----------------+
| 1           | 2000       | 3000       | 1500       | 1              |
| 2           | 5000       | 500        | 1000       | 2              |
| 3           | 1500       | 1000       | 3000       | 2              |
| 4           | 3000       | 6000       | 8000       | 3              |
| 5           | 1500       | 2000       | 1800       | 1              |
| 6           | 2500       | 1000       | 1900       | 1              |
+-------------+------------+------------+------------+----------------+
0: jdbc:hive2://linux03:10000> select * from tb_employee;
+------------------+-------------------+---------------------+------------------+
| tb_employee.uid  | tb_employee.name  | tb_employee.gender  | tb_employee.age  |
+------------------+-------------------+---------------------+------------------+
| 1                | zs                | M                   | 28               |
| 2                | ww                | F                   | 36               |
| 3                | zl                | F                   | 48               |
| 4                | pp                | M                   | 44               |
| 5                | wb                | M                   | 32               |
| 6                | TQ                | F                   | 32               |
+------------------+-------------------+---------------------+------------------+
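With all three tables loaded, the constructs named in the title can be combined in a single query. A sketch using the tables and columns defined above: a WITH ... AS block computes each employee's total pay (jb + jj + tc), which is then joined to the employee and department tables and aggregated per department, with CASE WHEN splitting the totals by gender:

```sql
-- Total pay per department, split by gender
with t as (
  select uid,
         jb + jj + tc as total_sal,
         deptno
  from tb_sal
)
select d.bname,
       count(*)         as emp_cnt,
       sum(t.total_sal) as dept_total,
       sum(case when e.gender = 'M' then t.total_sal else 0 end) as male_total,
       sum(case when e.gender = 'F' then t.total_sal else 0 end) as female_total
from t
join tb_employee   e on t.uid    = e.uid
join tb_department d on t.deptno = d.deptno
group by d.bname;
```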