No, definitely not. A Hive table with Sequence Files stored in hdfs:
/user/warehouse/
Kind Regards
Timothy Garza
Data Integration Developer
Collinson Technology Services
Skype: timothy.garza.cts
collinsongroup.com
I find the same thing, especially with Hive v1.2.1 that I am currently
trialling. It does lead to issues with the Metastore when trying to re-use the
same Hive Table name and I find manually deleting the files in HDFS serves as a
workaround.
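The manual-deletion workaround described above can be sketched roughly as follows; the warehouse path is the one quoted earlier in the thread, but the table directory name is an assumption for illustration, not taken from the original mail:

```shell
# Hypothetical sketch: remove the orphaned table directory left behind in
# HDFS so the same Hive table name can be re-used. "my_table" is an
# assumed name; substitute the directory of the dropped table.
hdfs dfs -ls /user/warehouse/my_table            # inspect what was left behind
hdfs dfs -rm -r -skipTrash /user/warehouse/my_table
```

Note that `-skipTrash` bypasses the HDFS trash, so the data is gone immediately; drop that flag if you want the safety net.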
Q. What does that have to do with the text in the Subject?
The following JIRA refers: https://issues.apache.org/jira/browse/HIVE-12553
From: Timothy Garza [mailto:timothy.ga...@collinsongroup.com]
Sent: 01 December 2015 12:44
To: user@hive.apache.org
Subject: RE: UPD
Upgrade your mysql connector.
Daniel
On Mon, Nov 30, 2015 at 8:12 PM, Timothy Garza <timothy.ga...@collinsongroup.com> wrote:
We’ve been playing with the MySQL Global Settings: (Hive metastore)
mysql> set global innodb_large_prefix = ON; (<-- this was set to OFF
previously)
…and now the ERROR is thus:
Specified key was too long; max key length is 3072 bytes
So it’s still ‘failing’ (but the HDFS operation itself succeeds).
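The "max key length is 3072 bytes" error usually means an index in the metastore schema is too wide for InnoDB. Since a utf8 column costs 3 bytes per character, a commonly cited workaround is to keep the metastore database in latin1, which is what the upstream Hive schema scripts expect. A minimal sketch, assuming the metastore database is named "metastore" (the real name depends on your javax.jdo.option.ConnectionURL):

```shell
# Assumption: metastore DB is called "metastore". Switching its default
# charset to latin1 shrinks index key widths below InnoDB's limit for
# any tables/indexes created afterwards.
mysql -u root -p -e "ALTER DATABASE metastore CHARACTER SET latin1 COLLATE latin1_bin;"
```

Existing tables created under utf8 may still need their character set converted (or the schema re-created) for the change to take effect.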
Weirdly I’m experiencing exactly the same issue when trying to populate a Hive
Table using INSERT OVERWRITE TABLE. We’re recently upgraded from Hive 0.13 to
1.2.1. NB. The Hive Table populates but the map-reduce returns an error code. I
have run the hive Schema Tool: schematool -dbType mysql -
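After an upgrade from 0.13 to 1.2.1, the usual sequence with the Schema Tool is to check the recorded schema version and then upgrade from the old one. A sketch under those assumptions (connection details are read from hive-site.xml; the exact flags used in the original mail are truncated, so these are taken from the schematool documentation, not the thread):

```shell
# Show the metastore schema version currently recorded in MySQL.
schematool -dbType mysql -info

# Upgrade the metastore schema from the pre-upgrade version.
schematool -dbType mysql -upgradeSchemaFrom 0.13.0
```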
2015 at 7:13 AM, Timothy Garza <timothy.ga...@collinsongroup.com> wrote:
I should mention that all mentioned .jar files are located in the directories
specified with the following privs: -rw-r--r-- 1 hadoop hadoop
I’ve installed Hive 1.2.1 on Amazon Linux AMI release 2015.03, master-node of
Hadoop cluster.
I can successfully access the Beeline client but when I try to connect to
Hive-Server2… beeline
> !connect jdbc://hive2:// :1 org.apache.hive.jdbc.HiveDriver
I get the following error:
No known d
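One thing worth checking: the quoted command uses "jdbc://hive2://", but the scheme Beeline expects is "jdbc:hive2://" (colon after "jdbc", not "://"), which would explain a "no known driver" style error. A sketch of the usual connection forms; the host, port (10000 is HiveServer2's default), and user below are assumptions:

```shell
# Connect directly from the command line (assumed host/port/user):
beeline -u jdbc:hive2://localhost:10000 -n hadoop

# Or, from inside the beeline shell:
#   !connect jdbc:hive2://localhost:10000 org.apache.hive.jdbc.HiveDriver
```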