Mohammad Kamrul Islam created HIVE-12475:
--------------------------------------------
Summary: Parquet schema evolution within array<struct<>> doesn't work
Key: HIVE-12475
URL: https://issues.apache.org/jira/browse/HIVE-12475
Project: Hive
Issue Type: Bug
Components: File Formats
Affects Versions: 1.1.0
Reporter: Mohammad Kamrul Islam
Assignee: Mohammad Kamrul Islam
If we create a table with a column of type array<struct<>> and later add a field to the struct, reading the table fails with the exception below.
The following SQL statements reproduce the error (the INSERT reads from an arbitrary existing table, tmp, just to produce two rows):
{quote}
CREATE TABLE pq_test (f1 array<struct<c1:int,c2:int>>) STORED AS PARQUET;
INSERT INTO TABLE pq_test SELECT array(named_struct("c1",1,"c2",2)) FROM tmp LIMIT 2;
SELECT * FROM pq_test;
ALTER TABLE pq_test REPLACE COLUMNS (f1 array<struct<c1:int,c2:int,cccccc:int>>); -- adds the new struct field cccccc
SELECT * FROM pq_test;
{quote}
Exception:
{quote}
Caused by: java.lang.ArrayIndexOutOfBoundsException: 2
	at org.apache.hadoop.hive.ql.io.parquet.serde.ArrayWritableObjectInspector.getStructFieldData(ArrayWritableObjectInspector.java:142)
	at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:363)
	at org.apache.hadoop.hive.serde2.SerDeUtils.buildJSONString(SerDeUtils.java:316)
	at org.apache.hadoop.hive.serde2.SerDeUtils.getJSONString(SerDeUtils.java:199)
	at org.apache.hadoop.hive.serde2.DelimitedJSONSerDe.serializeField(DelimitedJSONSerDe.java:61)
	at org.apache.hadoop.hive.serde2.lazy.LazySimpleSerDe.doSerialize(LazySimpleSerDe.java:236)
	at org.apache.hadoop.hive.serde2.AbstractEncodingAwareSerDe.serialize(AbstractEncodingAwareSerDe.java:55)
	at org.apache.hadoop.hive.ql.exec.DefaultFetchFormatter.convert(DefaultFetchFormatter.java:71)
	at org.apache.hadoop.hive.ql.exec.DefaultFetchFormatter.convert(DefaultFetchFormatter.java:40)
	at org.apache.hadoop.hive.ql.exec.ListSinkOperator.process(ListSinkOperator.java:89)
{quote}
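For illustration, here is a minimal standalone sketch (not Hive code; the helper names are hypothetical, only Hadoop's ArrayWritable/IntWritable/Writable types are real) of what appears to go wrong: rows written before the REPLACE COLUMNS carry only the two original struct values, while the object inspector built from the new schema asks for field index 2, so a plain array lookup like the one in ArrayWritableObjectInspector.getStructFieldData overruns the array. Treating the missing index as NULL would give the usual schema-evolution behaviour for a newly added field, i.e. the second SELECT would show cccccc as NULL for old rows instead of failing.
{code:java}
import org.apache.hadoop.io.ArrayWritable;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Writable;

public class ParquetStructEvolutionSketch {

  // Hypothetical stand-in for the field lookup: the new inspector asks for
  // field index 2 (cccccc), but rows written before ALTER TABLE only have 2 values.
  static Writable getFieldNaive(ArrayWritable struct, int fieldIndex) {
    return struct.get()[fieldIndex]; // ArrayIndexOutOfBoundsException for old rows
  }

  // Bounds-checked variant: a field the old data doesn't have comes back as null,
  // which Hive would render as NULL.
  static Writable getFieldSafe(ArrayWritable struct, int fieldIndex) {
    Writable[] values = struct.get();
    return fieldIndex < values.length ? values[fieldIndex] : null;
  }

  public static void main(String[] args) {
    // A struct<c1:int,c2:int> value as written before REPLACE COLUMNS.
    ArrayWritable oldRow = new ArrayWritable(IntWritable.class,
        new Writable[] { new IntWritable(1), new IntWritable(2) });

    System.out.println(getFieldSafe(oldRow, 2));  // prints "null"
    System.out.println(getFieldNaive(oldRow, 2)); // throws ArrayIndexOutOfBoundsException: 2
  }
}
{code}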