Re: Spark SQL check if query is completed (pyspark)

2014-09-08 Thread jamborta
Thank you for the replies.

I am running an insert on a join (INSERT OVERWRITE TABLE new_table SELECT *
FROM table1 AS a JOIN table2 AS b ON (a.key = b.key)).

The process does not have the right permission to write to the target folder,
so the following errors are printed:
chgrp: `/user/x/y': No such file or directory
chmod: `/user/x/y': No such file or directory
but the statement returns an empty RDD without raising an exception.
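Until that is fixed, one defensive pattern is to verify the target table right after the INSERT. The sketch below is illustrative, not part of any Spark API: `run_sql` stands in for something like `lambda q: sqlContext.sql(q).collect()`, and the helper name and queries are made up for the example.

```python
def insert_and_verify(run_sql, insert_statement, count_query):
    """Run an INSERT, then fail loudly if the target table ended up empty.

    run_sql: a callable that executes one statement and returns the
    collected rows, e.g. lambda q: sqlContext.sql(q).collect() in pyspark.
    """
    # May "succeed" even if Hive only printed chgrp/chmod errors to the console.
    run_sql(insert_statement)
    # Check the result, e.g. with "SELECT COUNT(*) FROM new_table".
    rows = run_sql(count_query)
    count = rows[0][0] if rows else 0
    if count == 0:
        raise RuntimeError("insert appears to have failed: target table is empty")
    return count
```

One caveat: an empty target table can also be a legitimate result (e.g. a join that matches nothing), so this check only makes sense where you expect rows.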

thanks,




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-SQL-check-if-query-is-completed-pyspark-tp13630p13685.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Spark SQL check if query is completed (pyspark)

2014-09-08 Thread Michael Armbrust
You are probably not getting an error because the exception is happening
inside Hive. I'd still consider this a bug, so feel free to open a JIRA.



Re: Spark SQL check if query is completed (pyspark)

2014-09-07 Thread Michael Armbrust
Sometimes the underlying Hive code will also print exceptions during
successful execution (for example, CREATE TABLE IF NOT EXISTS). If there is
actually a problem, Spark SQL should throw an exception.

What is the command you are running, and what is the error you are seeing?






Spark SQL check if query is completed (pyspark)

2014-09-06 Thread jamborta
Hi,

I am using Spark SQL to run some administrative queries and joins (e.g.
CREATE TABLE, INSERT OVERWRITE, etc.) where the query does not return any
data. I noticed that if the query fails, it prints an error message to the
console but does not actually throw an exception (this is Spark 1.0.2).

Is there any way to get these errors from the returned object?

thanks,
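One way to turn such console-only failures into something a program can react to is to force execution and wrap it. The sketch below assumes a `sql` callable shaped like `SQLContext.sql` (returning a lazy object with a `collect()` method); the helper name is hypothetical.

```python
def run_statement(sql, statement):
    """Execute a statement eagerly and surface failures as Python exceptions."""
    try:
        # sql(statement) is lazy in pyspark 1.x; collect() forces the job to
        # run, so an execution-time failure is raised here instead of only
        # being printed to the console.
        return sql(statement).collect()
    except Exception as exc:
        raise RuntimeError("statement failed: %s" % statement) from exc
```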






Re: Spark SQL check if query is completed (pyspark)

2014-09-06 Thread Davies Liu
SQLContext.sql() returns a SchemaRDD; you need to call collect() on it to
actually pull the data in.
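The point about laziness can be sketched without a cluster. The toy classes below only imitate the behaviour in question (they are not pyspark code): building the result always "succeeds", and an execution error only surfaces once an action such as collect() runs.

```python
class LazyResult:
    """Stand-in for a SchemaRDD: holds deferred work until an action runs."""
    def __init__(self, compute):
        self._compute = compute
    def collect(self):
        # Execution -- and therefore any execution-time error -- happens here.
        return self._compute()

def sql(statement):
    """Toy SQLContext.sql: planning never fails, execution might."""
    def compute():
        if "bad" in statement:
            raise ValueError("execution failed: " + statement)
        return [statement]
    return LazyResult(compute)
```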
