1. Background
While running a DataX job to sync data from a PostgreSQL database to a Hive database, the following error was thrown:
ERROR WriterRunner - Writer Runner Received Exceptions:
java.lang.IllegalArgumentException: No enum constant com.alibaba.datax.plugin.writer.hdfswriter.SupportHiveDataType.LONG
2. Diagnosing the Problem
The error message shows that an enum constant lookup failed, and the keyword LONG points to a numeric type. Careful inspection revealed a type mistake in the JSON job config: a bigint column had been declared as long, which HdfsWriter does not recognize.
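The faulty entry in the writer's column list and its fix look like this (col1 stands in for the actual column name):

Incorrect: {"name":"col1","type":"long"}
Correct:   {"name":"col1","type":"bigint"}

HdfsWriter resolves each configured type against its SupportHiveDataType enum after upper-casing it, which is why the message shows LONG. Per the DataX HdfsWriter documentation, the supported types are TINYINT, SMALLINT, INT, BIGINT, FLOAT, DOUBLE, STRING, VARCHAR, CHAR, BOOLEAN, DATE, and TIMESTAMP; long is not among them, while bigint is.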
3. Template Examples
-- Sync from PostgreSQL to Hive
{
"job": {
"content": [
{
"reader": {
"name": "postgresqlreader",
"parameter": {
"connection": [
{
"jdbcUrl": ["jdbc:postgresql://ip:host/db_name"],
"querySql": ["SELECT col1,col2,col3 FROM public.table_name"],
}
],
"username": "usr",
"password": "pwd"
}
},
"writer": {
"name": "hdfswriter",
"parameter": {
"defaultFS": "hdfs://ip:host",
"fileType": "orc",
"path": "/warehouse/tablespace/external/hive/db_name.db/table_name",
"fileName": "table_name",
"column": [
{"name":"col1","type":"bigint"},
{"name":"col2","type":"string"},
{"name":"col3","type":"double"}
],
"writeMode": "append/truncate",
"fieldDelimiter": "\t",
"encoding": "utf-8"
}
}
}],
"setting": {
"speed": {
"channel": "3"
},
"errorLimit": {
"record": 0,
"percentage": 0.02
}
}
}
}
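In this template, writeMode controls how existing files under path are handled; depending on the DataX version, append, nonConflict, and truncate are supported (append is used here). Once the config is saved to a file (pg2hive.json is just an example filename), the job is launched with the standard DataX entry script:

python $DATAX_HOME/bin/datax.py pg2hive.json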
-- Sync from Hive to PostgreSQL
{
"job": {
"setting": {
"speed": {
"channel": 1
},
"errorLimit": {
"record": 0,
"percentage": 0.02
}
},
"content": [{
"reader": {
"name": "hdfsreader",
"parameter": {
"path": "/warehouse/tablespace/external/hive/db_name.db/table_name",
"defaultFS": "hdfs://ip:host",
"fileType": "orc",
"column": [
{"index": 0,"type": "string"},
{"index": 1,"type": "string"},
{"index": 2,"type": "string"}
],
"fileType": "orc",
"encoding": "UTF-8",
"fieldDelimiter": "\t"
}
},
"writer": {
"name": "postgresqlwriter",
"parameter": {
"print": true,
"encoding": "UTF-8",
"username": "usr",
"password": "pwd",
"column": [
"col1",
"col2",
"col3"
],
"connection": [
{
"jdbcUrl": "jdbc:postgresql://ip:host/db_name",
"table": ["public.table_name"]
}
],
"writeMode": "update (col1)"
}
}
}
]
}
}
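A note on the writer's writeMode: update (col1) asks the PostgreSQL writer to upsert, using col1 as the conflict key; support for this mode depends on the DataX version (older releases only accept insert). Assuming upsert support, the generated SQL is roughly of this shape:

INSERT INTO public.table_name (col1, col2, col3)
VALUES (?, ?, ?)
ON CONFLICT (col1) DO UPDATE SET col2 = EXCLUDED.col2, col3 = EXCLUDED.col3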