Nirmalkumar created HIVE-14844:
----------------------------------
Summary: Not able to create a Hive table with more than 700 columns
Key: HIVE-14844
URL: https://issues.apache.org/jira/browse/HIVE-14844
Project: Hive
Issue Type: Bug
Components: Metastore
Affects Versions: 1.1.0
Environment: MySQL
Reporter: Nirmalkumar
We tried creating a Hive table with 700+ columns, which leads to the following
error:
FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask.
MetaException(message:javax.jdo.JDODataStoreException: Put request failed :
INSERT INTO `SERDE_PARAMS` (`PARAM_VALUE`,`SERDE_ID`,`PARAM_KEY`) VALUES (?,?,?)
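For reference, here is a sketch of the kind of DDL that triggers this (the
table and column names below are hypothetical). The statement is valid HiveQL;
the failure appears once it is extended past roughly 700 columns, presumably
because the serde parameter holding the serialized column metadata then
exceeds the 4000-character PARAM_VALUE limit shown below:

CREATE TABLE wide_table (
  col_001 STRING,
  col_002 STRING,
  col_003 STRING
  -- ...extended to 700+ columns in the failing case
);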
We use Hive 1.1.0, and below is the SERDE_PARAMS definition we have in our
MySQL metastore:
mysql> desc SERDE_PARAMS;
+-------------+---------------+------+-----+---------+-------+
| Field       | Type          | Null | Key | Default | Extra |
+-------------+---------------+------+-----+---------+-------+
| SERDE_ID    | bigint(20)    | NO   | PRI | NULL    |       |
| PARAM_KEY   | varchar(256)  | NO   | PRI | NULL    |       |
| PARAM_VALUE | varchar(4000) | YES  |     | NULL    |       |
+-------------+---------------+------+-----+---------+-------+
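Since the failing row itself is never stored, a hedged way to see which
existing serde parameters are approaching the 4000-character cap (assuming
direct read access to the MySQL metastore database) is:

SELECT SERDE_ID, PARAM_KEY, LENGTH(PARAM_VALUE) AS len
FROM SERDE_PARAMS
ORDER BY len DESC
LIMIT 10;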
As per Cloudera, this is a known limitation and there are no patches for it.
The typical workaround is to increase the size of the field in the metastore,
but this is not a supported Cloudera solution: it can break a future metastore
schema upgrade (for example, when moving to a newer CDH release that ships a
higher Hive version than the current one).
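For completeness, the unsupported workaround referred to above would look
something like the following against the MySQL metastore (a sketch only; as
noted, widening the column by hand may conflict with a later metastore schema
upgrade):

ALTER TABLE SERDE_PARAMS MODIFY PARAM_VALUE MEDIUMTEXT;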
So could you please provide a fix for this?