1. Failed to write partition files
Error log:
org.apache.spark.sql.AnalysisException: org.apache.hadoop.hive.ql.metadata.HiveException: org.apache.hadoop.hive.ql.metadata.HiveException: Load Data failed for hdfs://chinacreator/warehouse/tablespace/managed/hive/qt_debug_dt/.hive-staging_hive_2021-04-23_13-14-00_293_4321314254221670332-1/-ext-10000/org_id=qa1 as the file is not owned by hive and load data is also not ran as hive
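This typically shows up when a partitioned managed Hive table is written from spark-shell running as a user other than hive. A minimal sketch of the kind of write that triggers it, assuming a managed table qt_debug_dt partitioned by org_id (names taken from the log above; the source table and its columns are hypothetical):

import org.apache.spark.sql.SparkSession

// In spark-shell a Hive-enabled session named `spark` already exists;
// building one explicitly is only needed in a standalone application.
val spark = SparkSession.builder()
  .appName("partition-write-repro")
  .enableHiveSupport()
  .getOrCreate()

// An insert into a managed table goes through Hive's "Load Data" step,
// which rejects staged files that are not owned by the expected user.
spark.sql("""
  INSERT OVERWRITE TABLE qt_debug_dt PARTITION (org_id = 'qa1')
  SELECT id, name FROM src_table
""")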
Solution: if spark-shell is submitted as root, configure the following (typically in hive-site.xml):
<property>
  <name>hive.load.data.owner</name>
  <value>root</value>
</property>
If spark-shell is submitted as the spark user:
<property>
  <name>hive.load.data.owner</name>
  <value>spark</value>
</property>
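In other words, hive.load.data.owner should match the OS user that submits spark-shell, and changing it generally requires restarting the Hive services for the new value to take effect. To confirm who actually owns the files in question, ownership can be checked from inside spark-shell; a minimal sketch using the Hadoop FileSystem API (the path is the table's warehouse directory from the log above):

import org.apache.hadoop.fs.{FileSystem, Path}

// Inspect the owner of the table's warehouse directory: the Load Data
// step compares the staged files' owner against hive.load.data.owner.
val fs = FileSystem.get(spark.sparkContext.hadoopConfiguration)
val status = fs.getFileStatus(
  new Path("/warehouse/tablespace/managed/hive/qt_debug_dt"))
println(s"owner = ${status.getOwner}, group = ${status.getGroup}")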
First, read the log message carefully and think it over, then go search Baidu for material; otherwise the answer to this one is hard to find.
Permission problem:
<property>