Migrating Kudu Data Tables via Hive External Tables
The data volume kept growing and our servers could no longer keep up, so we moved to the cloud. These are my notes on migrating the Kudu tables.
1. Export the data to a CSV file:
`impala-shell -q "select * from vs_kudu_xxx_days" -B --output_delimiter="," -o /opt/vs_kudu_xxx_days_0131.csv`
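Before copying the file over, it can help to sanity-check the export by comparing row counts (with -B the output has no header line, so the two numbers should match as long as no field values contain embedded newlines):

wc -l /opt/vs_kudu_xxx_days_0131.csv
impala-shell -q "select count(*) from vs_kudu_xxx_days" -B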
2. scp the CSV file to the cloud server
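The host and destination path below are placeholders; substitute your own cloud server:

scp /opt/vs_kudu_xxx_days_0131.csv user@cloud-host:/opt/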
3. Create a temporary Hive table for staging
CREATE TABLE `vs_hive_xxx_days_temp` (
`id` string,
`aaa_count` int,
`avg_time` bigint,
`leave_time` string,
`resident_amount` int
) ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE;
4. Load the CSV file into the Hive staging table
load data local inpath '/opt/vs_kudu_xxx_days_0131.csv' into table vs_hive_xxx_days_temp;
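A quick count in Hive confirms the rows arrived in the staging table (purely an illustrative check):

select count(*) from vs_hive_xxx_days_temp;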
Steps 3 and 4 can also be combined: upload the CSV to HDFS and create an external table whose LOCATION points at the CSV directory, which has the same effect (see the sketch below).
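A minimal sketch of that alternative, assuming a hypothetical HDFS directory /data/vs_kudu_xxx_days_csv for the file. First, in the shell:

hdfs dfs -mkdir -p /data/vs_kudu_xxx_days_csv
hdfs dfs -put /opt/vs_kudu_xxx_days_0131.csv /data/vs_kudu_xxx_days_csv/

Then, in Hive, create the external table pointing at that directory instead of running LOAD DATA:

CREATE EXTERNAL TABLE `vs_hive_xxx_days_temp` (
`id` string,
`aaa_count` int,
`avg_time` bigint,
`leave_time` string,
`resident_amount` int
) ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE
LOCATION '/data/vs_kudu_xxx_days_csv';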