Re: Unable to run hive queries inside spark

2015-02-27 Thread sandeep vura
Hi Kundan, sorry, I am also facing a similar issue today. How did you resolve it? Regards, Sandeep.v On Thu, Feb 26, 2015 at 2:25 AM, Michael Armbrust mich...@databricks.com wrote: It looks like that is getting interpreted as a local path. Are you missing a core-site.xml file

Re: Unable to run hive queries inside spark

2015-02-25 Thread Michael Armbrust
It looks like that is getting interpreted as a local path. Are you missing a core-site.xml file to configure HDFS? On Tue, Feb 24, 2015 at 10:40 PM, kundan kumar iitr.kun...@gmail.com wrote: Hi Denny, yes the user has all the rights to HDFS. I am running all the spark operations with this
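For reference, a minimal core-site.xml (in the Hadoop conf directory, or alongside hive-site.xml in spark/conf) usually sets fs.defaultFS so that bare paths like /user/hive/warehouse resolve against HDFS instead of the local filesystem. The NameNode host and port below are placeholders, not values from this thread:

```xml
<configuration>
  <property>
    <!-- Placeholder NameNode address. Without fs.defaultFS, bare
         paths are interpreted against the local filesystem, which
         matches the "file:/user/hive/warehouse/src" in the error. -->
    <name>fs.defaultFS</name>
    <value>hdfs://namenode-host:8020</value>
  </property>
</configuration>
```

On Hadoop 1.x the equivalent property is fs.default.name.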

Re: Unable to run hive queries inside spark

2015-02-24 Thread kundan kumar
Hi Denny, yes the user has all the rights to HDFS. I am running all the spark operations with this user. and my hive-site.xml looks like this

<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/user/hive/warehouse</value>
  <description>location of default database for the

Re: Unable to run hive queries inside spark

2015-02-24 Thread Denny Lee
The error message you have is: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:file:/user/hive/warehouse/src is not a directory or unable to create one) Could you verify that you (the user you are running under) has the rights to create
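Assuming the warehouse path is meant to live in HDFS, one way to verify and grant access would be along these lines (paths per the thread; the exact permissions are a judgment call, shown here as a sketch):

```shell
# Check whether the warehouse directory exists and who owns it
hdfs dfs -ls /user/hive/warehouse

# Create it if missing and make it writable for the querying user
# (1777 is the sticky-bit, world-writable mode Hive commonly uses)
hdfs dfs -mkdir -p /user/hive/warehouse
hdfs dfs -chmod -R 1777 /user/hive/warehouse
```

Note that the error text says "file:/user/hive/warehouse/src", i.e. a local-filesystem URI, so the permissions that matter may be on the local path rather than HDFS.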

Re: Unable to run hive queries inside spark

2015-02-24 Thread Denny Lee
That's all you should need to do. Saying this, I did run into an issue similar to this when I was switching Spark versions which were tied to different default Hive versions (e.g., Spark 1.3 by default works with Hive 0.13.1). I'm wondering if you may be hitting this issue due to that? On Tue, Feb

Unable to run hive queries inside spark

2015-02-24 Thread kundan kumar
Hi, I have placed my hive-site.xml inside spark/conf and I am trying to execute some Hive queries given in the documentation. Can you please suggest what I am doing wrong here. scala> val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc) hiveContext:
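For context, the documented Spark 1.x spark-shell sequence being attempted looks roughly like this (the table name and statements are the standard examples from the Spark SQL programming guide; this is a sketch, not a transcript of the poster's session):

```scala
// In spark-shell, `sc` (a SparkContext) is provided automatically.
val hiveContext = new org.apache.spark.sql.hive.HiveContext(sc)

// Standard examples from the Spark SQL guide; the CREATE TABLE step
// is what fails here with the DDLTask MetaException, because the
// warehouse path resolves to the local filesystem.
hiveContext.sql("CREATE TABLE IF NOT EXISTS src (key INT, value STRING)")
hiveContext.sql("LOAD DATA LOCAL INPATH 'examples/src/main/resources/kv1.txt' INTO TABLE src")
hiveContext.sql("FROM src SELECT key, value").collect().foreach(println)
```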