Recently we hit an error in production: Spark could not write data into Hive and failed with the following error:
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Invalid partition for table orc_report_behavior
at org.apache.hadoop.hive.ql.metadata.Partition.initialize(Partition.java:208)
at org.apache.hadoop.hive.ql.metadata.Partition.<init>(Partition.java:106)
at org.apache.hadoop.hive.ql.metadata.Hive.getAllPartitionsOf(Hive.java:2103)
... 194 more
Caused by: MetaException(message:Invalid partition key & values; keys [day, ], values [])
at org.apache.hadoop.hive.metastore.Warehouse.makePartName(Warehouse.java:550)
at org.apache.hadoop.hive.metastore.Warehouse.makePartName(Warehouse.java:483)
at org.apache.hadoop.hive.ql.metadata.Partition.initialize(Partition.java:192)
... 196 more
The MetaException above suggests the metastore holds a partition record for the day key whose value is empty or malformed. At first I suspected a problem with the Hive process itself, but querying the table directly through Hive reported the same error. Inspecting the data source revealed garbled characters in the partition values; after some searching online I found a solution and am recording it here.
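To confirm which partition is bad, one way is to list the table's partition metadata from the Hive CLI and look for an empty or garbled day= value. This is a minimal sketch, not part of the original troubleshooting; the table name is taken from the error above, and SHOW PARTITIONS may itself fail if the corrupted record cannot be rendered:

# List all partitions of the table and scan for an empty or garbled "day=" entry.
hive -e "SHOW PARTITIONS orc_report_behavior;"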
Solution
Drop all of the table's partitions (the original script below is truncated; a hedged sketch of the same step follows it).
#!/bin/bash
source /etc/p
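Since the script above is cut off, here is a minimal sketch of what the "drop all table partitions" step might look like, assuming each partition is dropped one by one through the Hive CLI. Only the table name orc_report_behavior and the day partition key come from the error message; everything else is illustrative, and a partition whose value is truly empty or garbled may still need to be removed directly in the metastore.

#!/bin/bash
# Sketch: drop every partition of orc_report_behavior one by one.
# Assumes SHOW PARTITIONS still works and each line looks like "day=20240101".

TABLE="orc_report_behavior"

hive -e "SHOW PARTITIONS ${TABLE};" | while read part; do
    # Turn "day=20240101" into "day='20240101'" for the DROP PARTITION clause.
    spec=$(echo "${part}" | sed "s/^\(.*\)=\(.*\)$/\1='\2'/")
    echo "Dropping partition: ${spec}"
    hive -e "ALTER TABLE ${TABLE} DROP IF EXISTS PARTITION (${spec});"
done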