Re: Spark SQL check if query is completed (pyspark)

2014-09-08 Thread jamborta
Thank you for the replies. I am running an insert on a join (INSERT OVERWRITE TABLE new_table SELECT * FROM table1 AS a JOIN table2 AS b ON (a.key = b.key)). The process does not have the right permission to write to that folder, so I get the following error printed: chgrp: `/user/x/y': No such …
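
Below is a minimal pyspark sketch of the setup described above (Spark 1.x API; the table names come from the post, while the SparkContext/HiveContext setup is an assumption):

    # Minimal sketch (Spark 1.x): run the INSERT OVERWRITE through a HiveContext.
    from pyspark import SparkContext
    from pyspark.sql import HiveContext

    sc = SparkContext(appName="insert-overwrite-join")
    hc = HiveContext(sc)

    # The chgrp permission failure happens inside the embedded Hive code and is
    # only printed to the logs; the call itself returns without raising.
    hc.sql("""
        INSERT OVERWRITE TABLE new_table
        SELECT * FROM table1 a JOIN table2 b ON (a.key = b.key)
    """)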

Re: Spark SQL check if query is completed (pyspark)

2014-09-08 Thread Michael Armbrust
You are probably not getting an error because the exception is happening inside of Hive. I'd still consider this a bug, so feel free to open a JIRA.

Re: Spark SQL check if query is completed (pyspark)

2014-09-07 Thread Michael Armbrust
Sometimes the underlying Hive code will also print exceptions during successful execution (for example, CREATE TABLE IF NOT EXISTS). If there is actually a problem, Spark SQL should throw an exception. What is the command you are running, and what is the error you are seeing?
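
A hedged sketch of the distinction described here, reusing the HiveContext hc from the first sketch (the table definition is hypothetical): a genuine Spark SQL failure reaches Python as a py4j Py4JJavaError, so wrapping the call in try/except separates real errors from Hive's harmless printed noise.

    # If Spark SQL itself fails, the Java exception surfaces in Python as a
    # Py4JJavaError; output merely printed by Hive does not raise anything.
    from py4j.protocol import Py4JJavaError

    try:
        hc.sql("CREATE TABLE IF NOT EXISTS my_table (key INT, value STRING)")
    except Py4JJavaError as e:
        print("query failed: %s" % e)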

Re: Spark SQL check if query is completed (pyspark)

2014-09-06 Thread Davies Liu
SQLContext.sql() returns a SchemaRDD; you need to call collect() to pull the data in. On Sat, Sep 6, 2014 at 6:02 AM, jamborta jambo...@gmail.com wrote: Hi, I am using Spark SQL to run some administrative queries and joins (e.g. CREATE TABLE, INSERT OVERWRITE, etc.), where the query …
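
A short sketch of this point, again reusing hc from the first sketch (the COUNT query is illustrative): for a SELECT, sql() builds a SchemaRDD lazily, and collect() is what actually executes the query and brings the rows back to the driver.

    # sql() returns a SchemaRDD without running anything yet; collect()
    # triggers execution and pulls the result rows to the driver.
    result = hc.sql("SELECT COUNT(*) FROM new_table")
    rows = result.collect()
    print(rows)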