When the Hive table was created, the storage format was specified as:
STORED AS ORC tblproperties ('orc.compress'='SNAPPY');
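For reference, a complete DDL along these lines would create such a table; the table name matches the one used later, but the column list is a hypothetical sketch, since the source only shows the STORED AS clause:

CREATE TABLE persons_orc (
  id   INT,
  name STRING,
  age  INT
)
STORED AS ORC
TBLPROPERTIES ('orc.compress'='SNAPPY');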
When inserting data into the table, the following exception is thrown:
Caused by: java.lang.ClassCastException: org.apache.hadoop.io.Text cannot be cast to org.apache.hadoop.hive.ql.io.orc.OrcSerde$OrcSerdeRow
    at org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat$OrcRecordWriter.write(OrcOutputFormat.java:98)
    at org.apache.hadoop.hive.ql.exec.FileSinkOperator.process(FileSinkOperator.java:743)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
    at org.apache.hadoop.hive.ql.exec.SelectOperator.process(SelectOperator.java:97)
    at org.apache.hadoop.hive.ql.exec.Operator.forward(Operator.java:837)
    at org.apache.hadoop.hive.ql.exec.TableScanOperator.process(TableScanOperator.java:115)
    at org.apache.hadoop.hive.ql.exec.MapOperator$MapOpCtx.forward(MapOperator.java:169)
    at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:561)
At this point, inspect the table definition:
desc formatted persons_orc;
The output shows that the SerDe Library is LazySimpleSerDe rather than the ORC SerDe, so rows are handed to the ORC writer as Text instead of OrcSerdeRow, which is exactly the ClassCastException above.
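The relevant lines are in the Storage Information block of the DESCRIBE FORMATTED output. An illustrative excerpt of the mismatched state, consistent with the stack trace above (the OutputFormat is already ORC while the SerDe is not), would look roughly like:

SerDe Library:  org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe
InputFormat:    org.apache.hadoop.hive.ql.io.orc.OrcInputFormat
OutputFormat:   org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat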
The fix is to switch the table's serialization to ORC:
ALTER TABLE persons_orc SET FILEFORMAT ORC;
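After the ALTER, the table's SerDe and input/output formats should all be ORC. A quick way to verify (the expected class names are shown as comments and may vary slightly by Hive version):

desc formatted persons_orc;
-- SerDe Library:  org.apache.hadoop.hive.ql.io.orc.OrcSerde
-- InputFormat:    org.apache.hadoop.hive.ql.io.orc.OrcInputFormat
-- OutputFormat:   org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat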
Checking the table again, the SerDe is now ORC, and the insert succeeds:
insert overwrite table persons_orc select * from persons;
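As a final sanity check, any read against the table confirms the rows landed (hypothetical query):

select count(*) from persons_orc;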
A more detailed explanation is available at: http://www.imooc.com/article/252830