Setup Spark jobserver for Spark SQL

2015-04-02 Thread Harika
Hi,

I am trying to use Spark Jobserver
(https://github.com/spark-jobserver/spark-jobserver) for running Spark
SQL jobs.

I was able to start the server, but when I run my application (my Scala
class, which extends SparkSqlJob), I get the following response:

{
  "status": "ERROR",
  "result": "Invalid job type for this context"
}

Can anyone suggest what is going wrong, or provide a detailed procedure
for setting up the jobserver for Spark SQL?
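
For reference, here is a minimal sketch of such a job class (the object
and query are placeholders, and the API shown is from the jobserver
0.5.x era, so check it against the version you are running):

  import com.typesafe.config.Config
  import org.apache.spark.sql.SQLContext
  import spark.jobserver.{SparkJobValid, SparkJobValidation, SparkSqlJob}

  // Hypothetical example job: extending SparkSqlJob means the jobserver
  // hands the job a SQLContext rather than a plain SparkContext.
  object MySqlJob extends SparkSqlJob {
    // Called before the job runs; accept all input in this sketch.
    def validate(sql: SQLContext, config: Config): SparkJobValidation =
      SparkJobValid

    // The value returned here is serialized as the job result.
    def runJob(sql: SQLContext, config: Config): Any =
      sql.sql("SELECT 1").collect()
  }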






Re: Setup Spark jobserver for Spark SQL

2015-04-02 Thread Daniel Siegmann
You shouldn't need to do anything special. Are you using a named context?
I'm not sure those work with SparkSqlJob.

By the way, there is a forum on Google groups for the Spark Job Server:
https://groups.google.com/forum/#!forum/spark-jobserver
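
That error usually means the job type does not match the context it was
submitted to, e.g. a SparkSqlJob sent to a context created with the
default SparkContext factory. If you pre-create a named context, try
creating it with the SQL context factory. Roughly like this (the factory
class name is from the jobserver docs of that era, and sql-context,
my-app, and com.example.MySqlJob are placeholders):

  # Create a named context backed by a SQLContext
  curl -d "" 'localhost:8090/contexts/sql-context?context-factory=spark.jobserver.context.SQLContextFactory'

  # Submit the job to that context
  curl -d "" 'localhost:8090/jobs?appName=my-app&classPath=com.example.MySqlJob&context=sql-context'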
