You've still got me confused.  The SparkContext exists at the Driver, not
on an Executor.

Many Jobs can be run by a single SparkContext -- it is a common pattern to use
something like the Spark Jobserver, where all Jobs are run through a shared
SparkContext.
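
For example (a minimal sketch, not from this thread -- the app name and local
master are just placeholders), one driver-side SparkContext can submit several
Jobs, one per action:

    import org.apache.spark.{SparkConf, SparkContext}

    object SharedContextExample {
      def main(args: Array[String]): Unit = {
        // The SparkContext is created in, and lives in, the driver JVM.
        val conf = new SparkConf().setAppName("shared-context").setMaster("local[*]")
        val sc = new SparkContext(conf)

        val rdd = sc.parallelize(1 to 1000)

        // Each action below triggers a separate Job, all scheduled
        // through the same shared SparkContext.
        val total = rdd.reduce(_ + _)                  // Job 1
        val evens = rdd.filter(_ % 2 == 0).count()     // Job 2
        println(s"sum=$total, evens=$evens")

        sc.stop()
      }
    }

All of those Jobs share the same executors, and the executors live for the
lifetime of the SparkContext -- which is why the shared-SparkContext pattern
above reuses them.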

On Sun, Jan 17, 2016 at 12:57 PM, Jia Zou <jacqueline...@gmail.com> wrote:

> Hi Mark, sorry, I mean SparkContext.
> I would like to change Spark so that all submitted jobs (SparkContexts) run
> in one executor JVM.
>
> Best Regards,
> Jia
>
> On Sun, Jan 17, 2016 at 2:21 PM, Mark Hamstra <m...@clearstorydata.com>
> wrote:
>
>> -dev
>>
>> What do you mean by JobContext?  That is a Hadoop MapReduce concept, not
>> Spark.
>>
>> On Sun, Jan 17, 2016 at 7:29 AM, Jia Zou <jacqueline...@gmail.com> wrote:
>>
>>> Dear all,
>>>
>>> Is there a way to reuse executor JVMs across different JobContexts?
>>> Thanks.
>>>
>>> Best Regards,
>>> Jia
>>>
>>
>>
>
