Thanks Mandar.

Our need is to receive SQL queries from clients and submit them to a Spark
cluster. We don't want a new application to be submitted for each query; we
want executors to be shared across multiple queries, since we would cache
RDDs that are reused across queries.
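For example (a rough sketch only; it assumes a Spark 1.x SQLContext and a
hypothetical table name "events"), the idea is that a table cached once is
reused by every later query on the same context:

    // Sketch only -- "events" is a hypothetical table already
    // registered with this SQLContext.
    sqlContext.cacheTable("events")  // materialized in executor memory on first use

    // Both queries reuse the same executors and the same cached data.
    val q1 = sqlContext.sql("SELECT COUNT(*) FROM events").collect()
    val q2 = sqlContext.sql("SELECT user, COUNT(*) FROM events GROUP BY user").collect()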

If I understand correctly, a SparkContext corresponds to an application, and
an application can also be used interactively. So I was thinking of creating
a server that holds a SparkContext pointing to the YARN cluster and uses
this one context to run multiple queries over a period of time.
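Roughly like this (a minimal sketch, not working code; it assumes
yarn-client mode so the driver lives inside this server process and keeps
its executors alive between queries; all names are illustrative):

    // Minimal sketch -- names and structure are illustrative only.
    val conf = new org.apache.spark.SparkConf()
      .setAppName("interactive-sql-server")
      .setMaster("yarn-client")  // driver stays in this server process
    val sc = new org.apache.spark.SparkContext(conf)
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)

    // One long-lived context; each client request just runs another
    // query against it, reusing the same executors.
    def handleQuery(query: String): Array[org.apache.spark.sql.Row] =
      sqlContext.sql(query).collect()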

Can you suggest a way to achieve interactive queries with reuse of executors
across queries?

Thanks,
Atul


