On Tue, Dec 1, 2015 at 9:43 PM, Anfernee Xu <anfernee...@gmail.com> wrote:
> But I have a single server (JVM) that is creating the SparkContext. Are you
> saying Spark supports multiple SparkContexts in the same JVM? Could you
> please clarify this?

I'm confused. Nothing you've said so far requires multiple contexts. From
your original message:

> I have a long-running backend server where I will create a short-lived
> Spark job

You can have a single SparkContext and submit multiple jobs to it, and
that works regardless of the cluster manager or deploy mode.
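For example, a minimal sketch of that pattern (assuming a standalone Scala
driver; the server name, thread pool setup, and request loop here are
illustrative, not anything Spark-specific):

import org.apache.spark.{SparkConf, SparkContext}
import java.util.concurrent.{Executors, TimeUnit}

// Hypothetical long-running backend that shares one SparkContext.
object LongRunningServer {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("long-running-backend")
      .setMaster("local[*]") // or any cluster manager URL

    // One SparkContext for the lifetime of the server process.
    val sc = new SparkContext(conf)

    // Each incoming "request" becomes a short-lived Spark job
    // submitted to the shared context from its own thread.
    val pool = Executors.newFixedThreadPool(4)
    (1 to 10).foreach { requestId =>
      pool.submit(new Runnable {
        override def run(): Unit = {
          val sum = sc.parallelize(1 to 1000).map(_ * requestId).sum()
          println(s"request $requestId -> $sum")
        }
      })
    }

    pool.shutdown()
    pool.awaitTermination(1, TimeUnit.MINUTES)
    sc.stop()
  }
}

Spark's scheduler is fully thread-safe, so jobs submitted from separate
threads like this run concurrently; see the "Job Scheduling" page in the
docs if you want the fair scheduler instead of FIFO.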

-- 
Marcelo
