Re: Spark 1.1.0 and Hive 0.12.0 Compatibility Issue

2014-10-24 Thread arthur.hk.c...@gmail.com
Hi, My Steps:

### HIVE
CREATE TABLE CUSTOMER (
  C_CUSTKEY    BIGINT,
  C_NAME       VARCHAR(25),
  C_ADDRESS    VARCHAR(40),
  C_NATIONKEY  BIGINT,
  C_PHONE      VARCHAR(15),
  C_ACCTBAL    DECIMAL,
  C_MKTSEGMENT VARCHAR(10),
  C_COMMENT    VARCHAR(117)
) ROW FORMAT SERDE 'com.bizo.hive.serde.csv.CSVSerde';
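(Editor's note, hedged: the bizo CSVSerde used above is documented to hand every column back as a string, whatever type the DDL declares. A common workaround sketch, assuming that behavior, is to declare the SerDe-backed table with STRING columns and expose typed columns through a view; the `CUSTOMER_RAW` name below is illustrative, not from the thread.)

```sql
-- Sketch: let the CSVSerde table be all-STRING, since that is what the
-- SerDe actually produces, then cast to the intended types in a view.
CREATE TABLE CUSTOMER_RAW (
  C_CUSTKEY    STRING,
  C_NAME       STRING,
  C_ADDRESS    STRING,
  C_NATIONKEY  STRING,
  C_PHONE      STRING,
  C_ACCTBAL    STRING,
  C_MKTSEGMENT STRING,
  C_COMMENT    STRING
) ROW FORMAT SERDE 'com.bizo.hive.serde.csv.CSVSerde';

CREATE VIEW CUSTOMER AS
SELECT CAST(C_CUSTKEY   AS BIGINT)  AS C_CUSTKEY,
       C_NAME,
       C_ADDRESS,
       CAST(C_NATIONKEY AS BIGINT)  AS C_NATIONKEY,
       C_PHONE,
       CAST(C_ACCTBAL   AS DECIMAL) AS C_ACCTBAL,
       C_MKTSEGMENT,
       C_COMMENT
FROM CUSTOMER_RAW;
```

With the casts made explicit in the view, both Hive and Spark SQL see the same declared types and the same underlying strings, so neither engine has to guess.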

Spark 1.1.0 and Hive 0.12.0 Compatibility Issue

2014-10-23 Thread Arthur . hk . chan
Hi, my Spark is 1.1.0 and Hive is 0.12. I tried to run the same query in both Hive 0.12.0 and Spark 1.1.0: the HiveQL works, while Spark SQL fails. hive> select l_orderkey, sum(l_extendedprice*(1-l_discount)) as revenue, o_orderdate, o_shippriority from customer c join orders o on c.c_mktsegment

Re: Spark 1.1.0 and Hive 0.12.0 Compatibility Issue

2014-10-23 Thread Michael Armbrust
Can you show the DDL for the table? It looks like the SerDe might be saying it will produce a decimal type but is actually producing a string. On Thu, Oct 23, 2014 at 3:17 PM, arthur.hk.c...@gmail.com wrote: Hi, my Spark is 1.1.0 and Hive is 0.12, I tried to run the
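(Editor's note, hedged: the mismatch Michael describes can be probed from the shell. `DESCRIBE FORMATTED` shows the declared column types and the SerDe class; comparing a raw read of the column against an explicit cast shows whether the engine is relying on implicit string-to-decimal coercion. The queries below are a diagnostic sketch against the `CUSTOMER` table from the thread, not commands taken from it.)

```sql
-- Confirm what the table *declares* (C_ACCTBAL as DECIMAL) and which
-- SerDe backs it (com.bizo.hive.serde.csv.CSVSerde).
DESCRIBE FORMATTED CUSTOMER;

-- Hive 0.12 coerces the string the SerDe returns into the declared
-- DECIMAL on the fly, so this succeeds there:
SELECT C_ACCTBAL * 2 FROM CUSTOMER LIMIT 5;

-- An explicit cast removes the reliance on implicit coercion, which is
-- where an engine that trusts the declared type can otherwise break:
SELECT CAST(C_ACCTBAL AS DECIMAL) * 2 FROM CUSTOMER LIMIT 5;
```

If the first SELECT works in Hive but the equivalent read fails in Spark SQL 1.1.0, that is consistent with the diagnosis above: the SerDe advertises DECIMAL but hands back a string.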