[ https://issues.apache.org/jira/browse/HIVE-6784?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13954290#comment-13954290 ]

Brock Noland commented on HIVE-6784:
------------------------------------

FYI [~szehon]

> parquet-hive should allow column type change
> --------------------------------------------
>
>                 Key: HIVE-6784
>                 URL: https://issues.apache.org/jira/browse/HIVE-6784
>             Project: Hive
>          Issue Type: Bug
>          Components: File Formats, Serializers/Deserializers
>    Affects Versions: 0.13.0
>            Reporter: Tongjie Chen
>
> See also the following Parquet issue:
> https://github.com/Parquet/parquet-mr/issues/323
> Currently, if we change the column type of a Parquet-format Hive table 
> using "alter table parquet_table change c1 c1 bigint" (assuming the 
> original type of c1 is int), queries fail at runtime with an exception 
> thrown from the SerDe: "org.apache.hadoop.io.IntWritable cannot be cast 
> to org.apache.hadoop.io.LongWritable".
> This differs from Hive's behavior with other file formats, where it 
> attempts the cast (yielding null for incompatible types).



--
This message was sent by Atlassian JIRA
(v6.2#6252)