Sqoop is a tool for transferring data between Hadoop and relational databases.
Import data from an RDBMS into Hadoop
HDFS, Hive, HBase
Export data from Hadoop to an RDBMS
Import and export commands are translated into MapReduce jobs for parallel execution and fault tolerance
Target users
System administrators, database administrators
Big data analysts, big data engineers, etc.
MySQL Data Import to HDFS
Preparation: create the database and load the tables in MySQL
mysql> create database retail_db;
mysql> use retail_db;
mysql> source /root/data/sqoop/retail_db.sql
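To confirm the sample schema loaded, you can list the tables and spot-check a row count (a minimal sanity check; which tables exist depends on retail_db.sql):

```sql
mysql> show tables;
mysql> select count(*) from customers;
```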
Use Sqoop to import the customers table into HDFS:
sqoop import --connect jdbc:mysql://localhost:3306/retail_db --driver com.mysql.jdbc.Driver --table customers --username root --password ok --target-dir /data/retail_db/customers --m 3
This fails with an error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/json/JSONObject
at org.apache.sqoop.util.SqoopJsonUtil.getJsonStringforMap(SqoopJsonUtil.java:43)
at org.apache.sqoop.SqoopOptions.writeProperties(SqoopOptions.java:784)
at org.apache.sqoop.mapreduce.JobBase.putSqoopOptionsToConfiguration(JobBase.java:392)
at org.apache.sqoop.mapreduce.JobBase.createJob(JobBase.java:378)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:256)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:513)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.json.JSONObject
Copy the missing jar:
cp /opt/software/java-json.jar /opt/install/sqoop-1.4.6-cdh5.14.2/lib/
Re-run the import.
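Once the import succeeds, a quick sanity check is to list the output directory and peek at a part file (part-file names depend on the run; with --m 3 there should be three of them):

```shell
hdfs dfs -ls /data/retail_db/customers
hdfs dfs -cat /data/retail_db/customers/part-m-00000 | head
```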
Filtering with where
sqoop import \
--connect jdbc:mysql://localhost:3306/retail_db \
--driver com.mysql.jdbc.Driver \
--table orders \
--where "order_id<100" \
--username root \
--password root \
--delete-target-dir \
--target-dir /data/retail_db/orders \
--m 3
Filtering with columns
sqoop import \
--connect jdbc:mysql://localhost:3306/retail_db \
--driver com.mysql.jdbc.Driver \
--table customers \
--columns "customer_id,customer_fname,customer_lname" \
--username root \
--password root \
--delete-target-dir \
--target-dir /data/retail_db/customers \
--m 3
Filtering with a query (with --query, the statement must include $CONDITIONS, and a --split-by column is required when --m is greater than 1)
sqoop import \
--connect jdbc:mysql://localhost:3306/retail_db \
--driver com.mysql.jdbc.Driver \
--query "select * from orders where \$CONDITIONS" \
--username root \
--password root \
--split-by order_id \
--target-dir /data/retail_db/orders \
--m 3
Parallel execution: with --m 3, assuming there are only 90 order_id values, Sqoop splits the work into three range queries:
select * from orders where order_id>=0 and order_id<30;
select * from orders where order_id>=30 and order_id<60;
select * from orders where order_id>=60 and order_id<90;
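Under the hood, Sqoop first runs a bounding query on the split column and divides the [min, max] range evenly across the map tasks (a sketch of the generated SQL; the exact form varies by version):

```sql
-- Bounding query on the split column (order_id here)
SELECT MIN(order_id), MAX(order_id) FROM orders;
-- Each of the three map tasks then imports one third of the range
```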
If you split by a date column instead, suppose there are 3 days of data: 10 orders on day 1, 50 on day 2, and 20 on day 3. The splits are even in time but not in row count, so the map tasks end up unbalanced:
select * from orders where order_date>='2020-07-20' and order_date<'2020-07-21';
select * from orders where order_date>='2020-07-21' and order_date<'2020-07-22';
select * from orders where order_date>='2020-07-22' and order_date<'2020-07-23';
**Incremental Import**
sqoop import \
--connect jdbc:mysql://localhost:3306/retail_db \
--driver com.mysql.jdbc.Driver \
--table orders \
--username root \
--password root \
--incremental append \
--check-column order_date \
--last-value '2013-07-24 00:00:00' \
--target-dir /data/retail_db/orders \
--m 3
Sqoop's log shows the bounding query it uses to pick up the newly appended rows:
SELECT MIN(order_id), MAX(order_id) FROM orders WHERE ( order_date > '2013-07-24 00:00:00' AND order_date <= '2014-07-24 00:00:00.0' )
Insert a new test row in MySQL, then run another incremental import:
insert into orders values(99999,'2015-05-30 00:00:00',11599,'CLOSED');
sqoop import \
--connect jdbc:mysql://localhost:3306/retail_db \
--driver com.mysql.jdbc.Driver \
--table orders \
--username root \
--password root \
--incremental append \
--check-column order_date \
--last-value '0' \
--target-dir /data/retail_db/orders \
--m 3
SELECT MIN(order_id), MAX(order_id) FROM orders WHERE ( order_date > '2014-07-24 00:00:00.0' AND order_date <= '2015-05-30 00:00:00.0' )
Create a job
sqoop job --create mysql2hdfs \
-- import \
--connect jdbc:mysql://localhost:3306/retail_db \
--driver com.mysql.jdbc.Driver \
--table orders \
--username root \
--password root \
--incremental append \
--check-column order_date \
--last-value '0' \
--target-dir /data/retail_db/orders \
--m 3
List jobs
sqoop job --list
Run the job
sqoop job --exec mysql2hdfs
Insert another row, then run the job again:
insert into orders values(999999,'2016-05-30 00:00:00',11599,'CLOSED');
SELECT MIN(order_id), MAX(order_id) FROM orders WHERE ( order_date > '2015-05-30 00:00:00.0' AND order_date <= '2016-05-30 00:00:00.0' )
After each successful run, the job updates its stored --last-value to the maximum value seen during that run.
The '0' here has no real meaning; it only guarantees that the first run imports everything (any value smaller than the real minimum would do).
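You can inspect the stored value with `sqoop job --show`, which prints the saved job definition, including the current incremental last value:

```shell
sqoop job --show mysql2hdfs
```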
The job can then be scheduled to run daily with cron:
crontab -e
0 2 */1 * * sqoop job --exec mysql2hdfs
Importing Data into Hive
First create the target database in Hive:
hive -e "create database if not exists retail_db;"
sqoop import \
--connect jdbc:mysql://localhost:3306/retail_db \
--driver com.mysql.jdbc.Driver \
--table orders \
--username root \
--password root \
--hive-import \
--create-hive-table \
--hive-database retail_db \
--hive-table orders \
--m 3
Error: the output directory already exists
20/07/15 22:29:04 ERROR tool.ImportTool: Import failed: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://hadoop1:9000/user/root/orders already exists
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.checkOutputSpecs(FileOutputFormat.java:146)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:270)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:143)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1304)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1304)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1325)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:203)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:176)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:273)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:513)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Delete the existing directory:
[root@hadoop1 lib]# hdfs dfs -rmr hdfs://hadoop1:9000/user/root/orders
Running it again produces another error:
20/07/15 22:26:56 ERROR tool.ImportTool: Import failed: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50)
at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:392)
at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:379)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:530)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
hive-common-1.1.0-cdh5.14.2.jar is missing, so copy it from Hive's lib directory:
cp /opt/install/hive-1.1.0-cdh5.14.2/lib/hive-common-1.1.0-cdh5.14.2.jar /opt/install/sqoop-1.4.6-cdh5.14.2/lib/
Running again fails with yet another error:
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/shims/ShimLoader
at org.apache.hadoop.hive.conf.HiveConf$ConfVars.&lt;clinit&gt;(HiveConf.java:370)
at org.apache.hadoop.hive.conf.HiveConf.&lt;clinit&gt;(HiveConf.java:108)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:264)
at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:44)
at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:392)
at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:379)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:337)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:530)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.shims.ShimLoader
The hive-shims jars are missing as well; copy them from Hive:
cp /opt/install/hive-1.1.0-cdh5.14.2/lib/hive-shims* /opt/install/sqoop-1.4.6-cdh5.14.2/lib/
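With both jars in place the import completes. A quick check that the rows landed in Hive (assuming the orders table now exists in retail_db):

```shell
hive -e "select count(*) from retail_db.orders;"
```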
**Importing Data into Hive Partitions**
Drop the existing Hive table first:
drop table if exists orders;
Import:
sqoop import \
--connect jdbc:mysql://localhost:3306/retail_db \
--driver com.mysql.jdbc.Driver \
--query "select order_id,order_status from orders where order_date>='2013-11-03' and order_date<'2013-11-04' and \$CONDITIONS" \
--username root \
--password root \
--delete-target-dir \
--target-dir /data/retail_db/orders \
--split-by order_id \
--hive-import \
--hive-database retail_db \
--hive-table orders \
--hive-partition-key "order_date" \
--hive-partition-value "2013-11-03" \
--m 3
Note: the partition column must not be imported as a regular column, which is why the query selects only order_id and order_status.
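To verify the partition was created (a minimal check, assuming the import above succeeded):

```shell
hive -e "show partitions retail_db.orders;"
```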
Importing Data into HBase
1. Create the table in HBase:
create 'products','data','category'
2. Import with Sqoop:
sqoop import \
--connect jdbc:mysql://localhost:3306/retail_db \
--driver com.mysql.jdbc.Driver \
--username root \
--password root \
--table products \
--hbase-table products \
--column-family data \
--m 3
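To confirm the rows arrived in HBase, you can scan a few of them from the shell (a sketch; LIMIT keeps the output short):

```shell
echo "scan 'products', {LIMIT => 5}" | hbase shell
```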
Exporting Data from HDFS to MySQL
1. Create the target table in MySQL (the `where 1=2` predicate copies the schema of customers without copying any rows):
create table customers_demo as select * from customers where 1=2;
2. Upload the data to HDFS:
hdfs dfs -mkdir /customerinput
hdfs dfs -put customers.csv /customerinput
3. Export the data:
sqoop export \
--connect jdbc:mysql://localhost:3306/retail_db \
--driver com.mysql.jdbc.Driver \
--username root \
--password root \
--table customers_demo \
--export-dir /customerinput \
--m 1
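After the export finishes, the row count in customers_demo should match the number of lines in customers.csv (a minimal check):

```sql
mysql> select count(*) from retail_db.customers_demo;
```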
Sqoop Option Files
1. Write the options file job_RDBMS2HDFS.opt, one option or value per line:
import
--connect
jdbc:mysql://localhost:3306/retail_db
--driver
com.mysql.jdbc.Driver
--table
customers
--username
root
--password
root
--target-dir
/data/retail_db/customers
--delete-target-dir
--m
3
2. Run the script:
sqoop --options-file job_RDBMS2HDFS.opt