SparkSQL on hive error

2015-10-27 Thread Anand Nalya
Hi, I've a partitioned table in Hive (Avro) that I can query fine from the Hive CLI. When using SparkSQL, I'm able to query some of the partitions but get an exception on others. The query is: sqlContext.sql("select * from myTable where source='http' and date =
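For context, a minimal sketch (Spark 1.x API) of the kind of partition-filtered query being described. The table and column names come from the snippet above; the date literal and the surrounding setup are assumptions for illustration, since the original message is truncated.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    object PartitionQuery {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("PartitionQuery"))
        val sqlContext = new HiveContext(sc)

        // Filter on both partition columns; the date value below is a
        // placeholder, as the original message breaks off at "date =".
        val df = sqlContext.sql(
          "select * from myTable where source='http' and date='2015-10-01'")
        df.show()
      }
    }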

RE: SparkSQL on hive error

2015-10-27 Thread Cheng, Hao
Hi Anand, can you paste the table-creating statement? I'd like to reproduce that locally first. And BTW, which version are you using? Hao
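For readers following along, a hypothetical shape of the table-creating statement being requested here. The column names and partition layout are assumptions, not the poster's actual schema; STORED AS AVRO needs Hive 0.14+, and older setups spell out the Avro SerDe classes instead.

    // Run from a HiveContext (e.g. in the spark-shell, with `hiveContext`
    // created as in the sketch further up this thread). Schema is assumed.
    hiveContext.sql("""
      CREATE TABLE myTable (payload STRING)
      PARTITIONED BY (source STRING, `date` STRING)
      STORED AS AVRO
    """)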

Re: SparkSQL on Hive error

2014-10-13 Thread Kevin Paul
Thanks Michael, your patch works for me :) Regards, Kelvin Paul

SparkSQL on Hive error

2014-10-03 Thread Kevin Paul
Hi all, I tried to launch my application with spark-submit; the command I use is: bin/spark-submit --class ${MY_CLASS} --jars ${MY_JARS} --master local myApplicationJar.jar I've built Spark with SPARK_HIVE=true, was able to start HiveContext, and was able to run commands like,
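A minimal sketch of the application flow described: Spark built with Hive support (SPARK_HIVE=true), a HiveContext created inside the submitted jar, and a query issued through it. The class name and the query are placeholders, since the original message is truncated.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    // Submitted roughly as above:
    //   bin/spark-submit --class MyApplication --master local myApplicationJar.jar
    object MyApplication {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("MyApplication"))
        val hiveContext = new HiveContext(sc)

        // Any Hive statement can be issued once the context is up, for example:
        hiveContext.sql("SHOW TABLES").collect().foreach(println)
      }
    }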

Re: SparkSQL on Hive error

2014-10-03 Thread Michael Armbrust
Are you running master? There was briefly a regression here that is hopefully fixed by spark#2635 (https://github.com/apache/spark/pull/2635).

Re: SparkSQL on Hive error

2014-10-03 Thread Cheng Lian
Also make sure to call hiveContext.sql within the same thread where hiveContext is created, because Hive uses a thread-local variable to initialize the Driver.conf.
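A short sketch of the pitfall being pointed out: Hive keeps per-thread state, so a HiveContext created on one thread and queried from another can hit uninitialized configuration. The names below are illustrative, not from this thread.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    val sc = new SparkContext(new SparkConf().setAppName("ThreadLocality"))
    val hiveContext = new HiveContext(sc)  // created on the current thread

    // Risky: sql() runs on a different thread than the one that built the
    // context, so Hive's thread-local session state may not be set up there.
    new Thread(new Runnable {
      def run(): Unit = hiveContext.sql("SHOW TABLES")
    }).start()

    // Safe: create the context and call sql() on the same thread.
    hiveContext.sql("SHOW TABLES")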