Thanks Mandar.
Our need is to receive SQL queries from the client and submit them to a Spark
cluster. We don't want a new application to be submitted for each query. We
want the executors to be shared across multiple queries, since we would cache
RDDs that get reused across queries.
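The pattern described above (one long-lived application, cached data reused by many queries) might look roughly like the sketch below. This is only an illustration, not a definitive server design: the app name, the Parquet path, and the table name are all hypothetical, and it assumes Spark SQL via `SparkSession` (it requires the Spark libraries and a cluster to actually run).

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SharedQueryServer {
    public static void main(String[] args) {
        // One long-lived session: the application (and its executors)
        // stays up across all incoming client queries.
        SparkSession spark = SparkSession.builder()
                .appName("shared-query-server") // hypothetical name
                .getOrCreate();

        // Read the data once and cache it; subsequent queries hit the
        // cached blocks instead of re-reading the source.
        Dataset<Row> events = spark.read()
                .parquet("hdfs:///data/events"); // hypothetical path
        events.cache();
        events.createOrReplaceTempView("events");

        // Each incoming SQL string runs against the same cached view,
        // inside the same application.
        Dataset<Row> result = spark.sql("SELECT count(*) FROM events");
        result.show();

        // Call spark.stop() only when the whole server shuts down,
        // never per query, or the executors are released.
    }
}
```

The key point is that `spark.sql(...)` can be called any number of times against the one session, so executor JVMs and cached RDD/DataFrame blocks survive across queries.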
If I am correct, spark context
Hi Team,
I am new to Spark and writing my first program. I have written a sample
program with the Spark master set to local. To execute Spark on a local YARN
cluster, what should the value of the spark.master property be? Can I point
it to a remote YARN cluster? I would like to execute this as a Java
application and not
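For reference, on YARN the master value is simply `yarn`; the cluster's location is not encoded in the master URL but read from the Hadoop/YARN configuration files on the client side. A minimal sketch, assuming `spark-submit` is on the path and you have a local copy of the cluster's config files (the directory and main class below are hypothetical):

```shell
# Spark finds the (possibly remote) ResourceManager via these files;
# spark.master itself does not carry a YARN hostname.
export HADOOP_CONF_DIR=/etc/hadoop/conf   # copy of the cluster's *-site.xml files

spark-submit \
  --master yarn \
  --deploy-mode client \
  --class com.example.MyApp \
  my-app.jar
```

The same applies when launching programmatically: setting `spark.master=yarn` in the `SparkConf` works only if `HADOOP_CONF_DIR` (or `YARN_CONF_DIR`) is visible to the JVM.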