FAILED: SemanticException [Error 10006]: Partition not found: how do I delete the data in a partition of a partitioned table?
Test environment: hadoop2.6.1 + hive1.2.1 + spark-1.6.0
0: jdbc:hive2://2.hadoop.com:10008> insert overwrite table t_service_detail_log_txt PARTITION ( fdate="2016-01-06" ) select * from t_service_detail_log where fdate="2016-01-06";
0: jdbc:hive2://2.hadoop.com:10008> select count(1) from t_service_detail_log_txt where fdate='2016-01-06';
+-----------+--+
| _c0 |
+-----------+--+
| 44310448 |
+-----------+--+
1 row selected (24.948 seconds)
0: jdbc:hive2://2.hadoop.com:10008>
0: jdbc:hive2://2.hadoop.com:10008> ALTER TABLE t_service_detail_log_txt DROP IF EXISTS PARTITION(fdate='2016-01-06');
Error: org.apache.spark.sql.execution.QueryExecutionException: FAILED: SemanticException [Error 10006]: Partition not found (fdate = 2016-01-06) (state=,code=0)
How can I delete the data of a partition in a partitioned table when DROP PARTITION fails like this?
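Two things I am considering trying, sketched below but not yet verified. Since the error is an org.apache.spark.sql.execution.QueryExecutionException, port 10008 above is the Spark Thrift Server; the HiveServer2 port in the first sketch is an assumption, not part of my setup above.

-- Sketch 1 (assumed HiveServer2 port): run the same DDL against HiveServer2 directly
-- instead of the Spark Thrift Server, e.g. beeline -u jdbc:hive2://2.hadoop.com:10000
ALTER TABLE t_service_detail_log_txt DROP IF EXISTS PARTITION (fdate='2016-01-06');

-- Sketch 2: keep the partition metadata and only remove its data
-- (TRUNCATE ... PARTITION works on managed tables since Hive 0.11)
TRUNCATE TABLE t_service_detail_log_txt PARTITION (fdate='2016-01-06');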