Hive issue inserting records to partitioned table

Time: 2024-08-31 09:05:56

Hi Sam,

Recently we upgraded our cluster from HDP 2.5.6 to HDP 2.6.4 and I am getting a similar error. With the current version, Hive is stricter about INSERT OVERWRITE TABLE. What this means is that you are likely deleting the data prior to loading the table, but not dropping the partition, before you run INSERT OVERWRITE TABLE.
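For illustration, here is a minimal sketch of the sequence that can trigger this error. The table, columns, paths, and the staging table sales_staging are hypothetical placeholders, not taken from the original post:

-- Hypothetical partitioned external table used only for illustration.
CREATE EXTERNAL TABLE IF NOT EXISTS sales (store_id INT, amount DOUBLE)
PARTITIONED BY (pk_ppweek INT)
LOCATION '/data_dir/sales';

-- The data files are removed from HDFS, but the partition is NOT dropped,
-- so the metastore still points at the now-inconsistent directory.
dfs -rm -r /data_dir/sales/pk_ppweek=2487;

-- On HDP 2.6.x this overwrite can then fail with
-- "Destination directory ... has not be cleaned up."
INSERT OVERWRITE TABLE sales PARTITION (pk_ppweek=2487)
SELECT store_id, amount FROM sales_staging WHERE week = 2487;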

To get around it,

Try to delete the data and drop the partition prior to running the INSERT OVERWRITE TABLE (see the sketch below).
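A minimal sketch of this first option, reusing the hypothetical sales table from above (all names and paths are placeholders):

-- 1. Drop the stale partition from the metastore.
ALTER TABLE sales DROP IF EXISTS PARTITION (pk_ppweek=2487);

-- 2. For an EXTERNAL table, DROP PARTITION leaves the data directory
--    behind, so remove the leftover files explicitly.
dfs -rm -r /data_dir/sales/pk_ppweek=2487;

-- 3. Re-run the overwrite against a clean destination.
INSERT OVERWRITE TABLE sales PARTITION (pk_ppweek=2487)
SELECT store_id, amount FROM sales_staging WHERE week = 2487;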

OR don't delete the data or drop the partition for the external table; let the INSERT OVERWRITE TABLE replace it (sketched below).
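A sketch of the second option with the same hypothetical table: no manual cleanup at all, since the overwrite replaces the partition's current contents in one step:

-- No hdfs delete and no DROP PARTITION beforehand; the overwrite
-- itself swaps out whatever data the partition currently holds.
INSERT OVERWRITE TABLE sales PARTITION (pk_ppweek=2487)
SELECT store_id, amount FROM sales_staging WHERE week = 2487;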

Regards

Khaja Hussain.

Similar Error:

Caused by: java.util.concurrent.ExecutionException: org.apache.hadoop.hive.ql.metadata.HiveException: Destination directory hdfs://data_dir/pk_business=bsc/pk_data_source=pos/pk_frequency=bnw/pk_data_state=c13251_ps2111_bre000_pfc00000_spr000_pfs00000/pk_reporttype=BNN/pk_ppweek=2487 has not be cleaned up.