Sorry for missing your original email - thanks for the catch, eh?!
On Thu, Sep 25, 2014 at 7:14 AM, arthur.hk.c...@gmail.com wrote:
Hi,
Fixed the issue by downgrading Hive from 0.13.1 to 0.12.0; it works well now.
Regards
On 31 Aug, 2014, at 7:28 am, arthur.hk.c...@gmail.com wrote:
Hi Michael,
Thank you so much!!
I have tried changing the following key lengths from 256 to 255 and from 767 to
766, but it still didn't work:
alter table COLUMNS_V2 modify column COMMENT VARCHAR(255);
alter table INDEX_PARAMS modify column PARAM_KEY VARCHAR(255);
alter table SD_PARAMS modify column PARAM_KEY VARCHAR(255);
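(A quick way to double-check what lengths the metastore columns actually ended
up with is to query information_schema; a minimal sketch, where the JDBC URL,
user, and password are placeholders for your own MySQL setup:)

import java.sql.DriverManager

// Assumes the MySQL JDBC driver is on the classpath; "hive" is the
// metastore database name used elsewhere in this thread.
val conn = DriverManager.getConnection(
  "jdbc:mysql://localhost:3306/hive", "hiveuser", "hivepass")
val rs = conn.createStatement().executeQuery(
  "SELECT table_name, column_name, character_maximum_length " +
  "FROM information_schema.columns " +
  "WHERE table_schema = 'hive' AND column_name IN ('COMMENT', 'PARAM_KEY')")
while (rs.next())
  println(rs.getString(1) + "." + rs.getString(2) + " -> " + rs.getLong(3))
conn.close()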
Oh, you may actually be running into an issue with your MySQL setup; try running
alter database metastore_db character set latin1
so that Hive (and the Spark HiveContext) can execute properly against the
metastore.
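(For context on why the charset matters: under MySQL's utf8 charset a character
can take up to 3 bytes, so an index over a VARCHAR(256) column needs 768 bytes,
just over InnoDB's 767-byte key limit, whereas latin1 uses one byte per
character. A minimal sketch to check which charset the metastore database
currently uses; the JDBC URL and credentials are placeholders, and the database
in this thread is named hive rather than metastore_db:)

import java.sql.DriverManager

// Placeholder credentials; point this at your actual metastore database.
val conn = DriverManager.getConnection(
  "jdbc:mysql://localhost:3306/hive", "hiveuser", "hivepass")
val rs = conn.createStatement().executeQuery(
  "SELECT default_character_set_name FROM information_schema.schemata " +
  "WHERE schema_name = 'hive'")
if (rs.next()) println("metastore charset: " + rs.getString(1))
conn.close()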
On August 29, 2014 at 04:39:01, arthur.hk.c...@gmail.com wrote:
Hi,
Already done but still get the same error:
(I use Hive 0.13.1, Spark 1.0.2, Hadoop 2.4.1)
Steps:
Step 1) mysql:
alter database hive character set latin1;
Step 2) HIVE:
hive> create table test_datatype2 (testbigint bigint);
OK
Time taken: 0.708 seconds
hive> drop table test_datatype2;
Hi,
Tried the same thing in HIVE directly without issue:
HIVE:
hive> create table test_datatype2 (testbigint bigint);
OK
Time taken: 0.708 seconds
hive> drop table test_datatype2;
OK
Time taken: 23.272 seconds
Then tried again in Spark:
scala> val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
Spark SQL is based on Hive 0.12. They must have changed the maximum key size
between 0.12 and 0.13.
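(One way to confirm which schema the metastore actually holds after switching
Hive versions is to read the VERSION table Hive keeps in the metastore; a
minimal sketch, assuming placeholder JDBC credentials and that the
schema-verification VERSION table is present, as it is on recent metastores:)

import java.sql.DriverManager

// Placeholder connection details; VERSION stores the schema version the
// metastore was created or upgraded with.
val conn = DriverManager.getConnection(
  "jdbc:mysql://localhost:3306/hive", "hiveuser", "hivepass")
val rs = conn.createStatement().executeQuery(
  "SELECT SCHEMA_VERSION, VERSION_COMMENT FROM VERSION")
if (rs.next()) println(rs.getString(1) + " - " + rs.getString(2))
conn.close()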
On Fri, Aug 29, 2014 at 4:38 AM, arthur.hk.c...@gmail.com wrote:
(Please ignore if duplicated)
Hi,
I use Spark 1.0.2 with Hive 0.13.1
I have already set the Hive MySQL database to latin1:
mysql:
alter database hive character set latin1;
Spark:
scala> val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
scala> hiveContext.hql("create table test_datatype2 (testbigint bigint)")
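(For completeness, the full spark-shell sequence being attempted, reconstructed
as a sketch from the snippets above; hql takes the HiveQL string directly in
Spark 1.0.x:)

// In spark-shell, sc is already defined.
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)
// The same DDL that works from the Hive CLI; against a Hive 0.13 metastore
// this is reportedly where the key-length error surfaces.
hiveContext.hql("create table test_datatype2 (testbigint bigint)")
hiveContext.hql("drop table test_datatype2")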