Column length not sufficient for large STRUCT definitions
---------------------------------------------------------
Key: HIVE-1632
URL: https://issues.apache.org/jira/browse/HIVE-1632
Project: Hadoop Hive
Issue Type: Bug
Components: Metastore
Affects Versions: 0.5.0
Reporter: Wolfgang Nagele
Priority: Trivial
This can be reproduced by creating the following table:
{code}hive> CREATE TABLE test (big struct<prop1: int,
prop2: int,
prop3: int,
prop4: int,
prop5: int,
prop6: int,
prop7: int,
prop8: int,
prop9: int,
prop10: int,
prop11: int,
prop12: int,
prop13: int,
prop14: int,
prop15: int,
prop16: int,
prop17: int,
prop18: int,
prop19: int>);{code}
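The failure is purely a matter of string length: the metastore serializes the whole struct definition into the {{TYPE_NAME}} column, which Derby creates as a VARCHAR(128) by default, and a struct with fields {{prop1}} through {{prop19}} serializes to well over 128 characters. A quick illustration in plain Python (not Hive code, just counting characters):

```python
# Build the serialized type name Hive stores in the metastore's
# COLUMNS.TYPE_NAME column for a struct<prop1:int,...,prop19:int>.
fields = ",".join(f"prop{i}:int" for i in range(1, 20))
type_name = f"struct<{fields}>"

print(len(type_name))  # 207 -- exceeds Derby's default VARCHAR(128)
```

The error message above shows exactly this: Derby truncates the 207-character value at 128 characters and raises a truncation error instead of storing it.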
Error:
{noformat}FAILED: Error in metadata: javax.jdo.JDODataStoreException: Add
request failed : INSERT INTO COLUMNS
(SD_ID,COMMENT,"COLUMN_NAME",TYPE_NAME,INTEGER_IDX) VALUES (?,?,?,?,?)
NestedThrowables:
java.sql.SQLDataException: A truncation error was encountered trying to shrink
VARCHAR 'struct<prop1:int,prop2:int,prop3:int,prop4:int,prop5:int,pro&' to
length 128.
FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask{noformat}
Workaround:
Increase the column length in the metastore. Derby example: {{ALTER TABLE columns ALTER
type_name SET DATA TYPE VARCHAR(1000);}}