sqoop import -D mapred.job.queue.name=root.myqueue \
--connect jdbc:oracle:thin:@192.168.1.128:1521:mydatabase \
--username jsz \
--password 123456 \
--table mys.test \
--columns ID,NAME,AGE,SALARY,DATA_DATE \
-m 1 \
--hive-table default.test \
--create-hive-table \
--hive-drop-import-delims \
--verbose \
--fetch-size 5000 \
--target-dir /apps/testdata/default/test \
--delete-target-dir \
--hive-overwrite \
--null-string '\N' \
--null-non-string '\N' \
--hive-import
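Since the same queue, JDBC URL, and credentials recur in every example, a small wrapper can assemble the command once. This is only a sketch: the settings below are the placeholder values from the command above, and `eval "$CMD"` would only succeed on a node with Sqoop installed.

```shell
#!/bin/sh
# Placeholder connection settings, copied from the example above.
QUEUE=root.myqueue
JDBC_URL="jdbc:oracle:thin:@192.168.1.128:1521:mydatabase"
ORA_USER=jsz
ORA_PASS=123456

# Assemble the full-table import; on a cluster node, run it with: eval "$CMD"
CMD="sqoop import -D mapred.job.queue.name=$QUEUE \
  --connect $JDBC_URL --username $ORA_USER --password $ORA_PASS \
  --table mys.test --columns ID,NAME,AGE,SALARY,DATA_DATE -m 1 \
  --hive-import --hive-table default.test --hive-overwrite \
  --target-dir /apps/testdata/default/test --delete-target-dir"
echo "$CMD"
```

Keeping the connection details in one place also makes it easier to later swap `--password` for `--password-file`.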
2. The Hive table is partitioned; incrementally extract data from Oracle into it:
sqoop import -D mapred.job.queue.name=root.myqueue \
--connect jdbc:oracle:thin:@192.168.1.128:1521:mydatabase \
--username jsz \
--password 123456 \
--table mys.test \
--columns ID,NAME,AGE,SALARY,LCD \
-m 1 \
--hive-partition-key op_day \
--hive-partition-value 20160525 \
--where "lcd >= trunc(to_date('20160525','YYYYMMDD'),'dd') and lcd < trunc(to_date('20160526','YYYYMMDD'),'dd')" \
--hive-table default.test_partition \
--create-hive-table \
--hive-drop-import-delims \
--verbose \
--fetch-size 5000 \
--target-dir /apps/testdata/default/test_partition \
--delete-target-dir \
--hive-overwrite \
--null-string '\N' \
--null-non-string '\N' \
--hive-import
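For backfills, the `--hive-partition-value` and the two bounds in `--where` have to move together day by day. A sketch of a loop that prints each day's window (GNU `date` assumed; the start/end days are made-up example values):

```shell
#!/bin/sh
# Print one (partition value, WHERE window) pair per day in [START, END).
START=20160525
END=20160528
DAY=$START
while [ "$DAY" != "$END" ]; do
  NEXT=$(date -d "$DAY + 1 day" +%Y%m%d)   # GNU date arithmetic
  echo "op_day=$DAY where: lcd >= to_date('$DAY','YYYYMMDD') and lcd < to_date('$NEXT','YYYYMMDD')"
  DAY=$NEXT
done
```

Each printed pair would be spliced into the import command above as `--hive-partition-value $DAY` plus the matching `--where` clause.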
3. Using a select statement (free-form query import):
sqoop import \
-D mapred.job.queue.name=root.myqueue \
-D oracle.sessionTimeZone=America/Los_Angeles \
--connect jdbc:oracle:thin:@192.168.1.128:1521:mydatabase \
--username jsz \
--password 123456 \
--query "select ID,NAME,AGE,SALARY,DATA_DATE from mys.test where ID = '10086' and \$CONDITIONS" \
--verbose \
--fetch-size 5000 \
--hive-table default.test \
--target-dir /apps/testdata/default/test \
--delete-target-dir \
--fields-terminated-by '\001' \
--lines-terminated-by '\n' \
-m 1 \
--hive-import \
--hive-overwrite \
--null-string '\N' \
--null-non-string '\N' \
--hive-drop-import-delims
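One detail worth calling out: `$CONDITIONS` must reach Sqoop literally, because Sqoop rewrites it into each mapper's split predicate at runtime. In a double-quoted shell string the dollar sign must be escaped as `\$CONDITIONS`; single quotes preserve it as-is. A minimal demonstration of the two quoting styles:

```shell
#!/bin/sh
# Single quotes keep $CONDITIONS literal for Sqoop to substitute later.
Q_SINGLE='select ID from mys.test where $CONDITIONS'
# In double quotes the dollar sign must be escaped, or the shell expands it
# to an empty string before Sqoop ever sees the query.
Q_DOUBLE="select ID from mys.test where \$CONDITIONS"
echo "$Q_SINGLE"
echo "$Q_DOUBLE"
```

Both variables hold the same string. Note also that with `-m` greater than 1, a `--query` import additionally requires `--split-by` so Sqoop knows which column to partition on.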
Using Sqoop's export tool:
Loading data from Hive into Oracle takes three steps:

1. Delete the historical data from the Oracle table:
sqoop eval -D mapred.job.queue.name=root.myqueue \
--connect jdbc:oracle:thin:@192.168.1.128:1521:mydatabase \
--username jsz \
--password 123456 \
--verbose \
-e "delete from mys.test"
2. Export the Hive data to the specified HDFS path:
hive -e "use default; set mapred.job.queue.name=root.myqueue; set hive.insert.into.multilevel.dirs=true; insert overwrite directory '/apps/testdata/default/test' select id,name,age,salary,data_date from default.test;"
3. Import the data from HDFS into Oracle:
sqoop export \
-D mapred.job.queue.name=root.myqueue \
-D mapred.task.timeout=0 \
--connect jdbc:oracle:thin:@192.168.1.128:1521:mydatabase \
--username jsz \
--password 123456 \
--table mys.test \
--columns ID,NAME,AGE,SALARY,DATA_DATE \
--export-dir /apps/testdata/default/test \
--verbose \
--input-null-string '\N' \
--input-null-non-string '\N' \
--input-fields-terminated-by '\001' \
--input-lines-terminated-by '\n' \
-m 1
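The three steps can be chained in one driver script so that a failure in any step aborts the run before the next step deletes or exports anything. This is a sketch: the actual `sqoop`/`hive` invocations (commented out below) are the ones shown above and would only run on a cluster node.

```shell
#!/bin/sh
set -e  # abort on the first failing step

STEPS=0
step() { STEPS=$((STEPS + 1)); echo "== step $STEPS: $*"; }

step "delete Oracle history"
# sqoop eval ... -e "delete from mys.test"
step "export Hive table to HDFS"
# hive -e "... insert overwrite directory '/apps/testdata/default/test' ..."
step "sqoop export from HDFS into Oracle"
# sqoop export ... --export-dir /apps/testdata/default/test -m 1
echo "all $STEPS steps finished"
```

With `set -e`, a non-zero exit from any real command stops the script, so a failed Hive export cannot be followed by an export of stale HDFS data.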
Hive: setting the job queue for an ad-hoc query:
hive -e "set mapred.job.queue.name=root.zm_yarn_pool.production; select count(*) from ods.ods_zmnode_table;"