Hive 1.2.1: newly added columns read as NULL in a partitioned table

1. Problem: after adding columns to a Hive table, data written afterwards reads correctly, but for all pre-existing partitions the new columns come back as NULL.

Fix: look up the table's storage descriptors in the metastore table SDS

MySQL [hive]> SELECT * FROM SDS WHERE location LIKE '%video_test%' ;  
+--------+-------+------------------------------------------+---------------+---------------------------+--------------------------------------------------------------------------------+-------------+------------------------------------------------------------+----------+
| SD_ID  | CD_ID | INPUT_FORMAT                             | IS_COMPRESSED | IS_STOREDASSUBDIRECTORIES | LOCATION                                                                       | NUM_BUCKETS | OUTPUT_FORMAT                                              | SERDE_ID |
+--------+-------+------------------------------------------+---------------+---------------------------+--------------------------------------------------------------------------------+-------------+------------------------------------------------------------+----------+
| 153927 |  1580 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://cluster1/video_test/daily/2018/03/01 |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |   153927 |
| 154297 |  1580 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://cluster1/video_test/daily/2018/03/02 |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |   154297 |
| 154529 |  1580 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://cluster1/video_test//daily/2017/12/20 |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |   154529 |
.....
| 180211 |  2880 | org.apache.hadoop.mapred.TextInputFormat |               |                           | hdfs://cluster1/video_test/daily/2018/04/26 |          -1 | org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat |   180211 |

Notice that the partitions that read correctly have a different CD_ID from the broken ones.

Pointing the broken partitions at the same (new) column descriptor resolves the issue:
UPDATE SDS SET CD_ID='2880' WHERE CD_ID='1580';  (a careless edit here can corrupt the metastore)
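Because a bad UPDATE against the metastore is hard to undo, a safer sequence is to take a backup, wrap the change in a transaction, and verify the affected row count first. A sketch, using the CD_ID values from the query above (substitute your own):

```sql
-- Back up the metastore first, e.g.: mysqldump hive SDS > sds_backup.sql
START TRANSACTION;

-- Count how many partitions still point at the old column descriptor.
SELECT COUNT(*) FROM SDS WHERE CD_ID = 1580;

-- Repoint them at the descriptor that includes the new columns.
UPDATE SDS SET CD_ID = 2880 WHERE CD_ID = 1580;

-- If the UPDATE's affected-row count matches the SELECT, keep the change;
-- otherwise ROLLBACK.
COMMIT;
```

CD_ID is a numeric (BIGINT) column in the metastore schema, so the literals need no quotes.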

2. Alternatively, drop the affected partitions and ADD PARTITION them back. For an external table this is low risk (dropping a partition removes only metadata, not the data files), but it is tedious when there are many partitions.
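For an external table, the drop-and-re-add step can be sketched as follows. The table name, partition key `dt`, and partition value are placeholders, since the actual partition spec is not shown above:

```sql
-- Dropping a partition of an EXTERNAL table removes only the metadata;
-- the files in HDFS are untouched.
ALTER TABLE video_test DROP IF EXISTS PARTITION (dt='2018-03-01');

-- Re-adding the partition registers it against the table's current schema,
-- so the new columns are now readable for this partition.
ALTER TABLE video_test ADD PARTITION (dt='2018-03-01')
  LOCATION 'hdfs://cluster1/video_test/daily/2018/03/01';
```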

If the directory layout on HDFS matches the partition keys, you can also try msck repair table tablename; (this may mis-register partitions).

3. The simplest approach: add the columns with CASCADE, which forces the schema change onto the metadata of all existing partitions as well.

alter table tablename add columns (cl1name string COMMENT 'cl1', cl2name string COMMENT 'cl2') CASCADE;
