To: <kow...@gmail.com>
Cc: Rohit Verma <rohit.ve...@rokittech.com>; user@spark.apache.org; Sing, Jasbir <jasbir.s...@accenture.com>; Mark Hamstra <m...@clearstorydata.com>
Subject: Re: Having multiple spark context
In general, in a single JVM (which is basically running in local mode) you have only one Spark Context. However, you can stop the current Spark Context with

sc.stop()
HTH
Dr Mich Talebzadeh
LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
A clustering library is necessary to manage multiple JVMs; Akka Cluster, for instance.

On 30 Jan 2017 at 8:01 AM, "Rohit Verma" wrote:
Hi,
If I am right, you need to launch the other context from another JVM. If you try to launch another context from the same JVM, it will return the existing context.
Rohit
On Jan 30, 2017, at 12:24 PM, Mark Hamstra wrote:
More than one Spark Context in a single Application is not supported.
On Sun, Jan 29, 2017 at 9:08 PM, wrote:
Hi,
I have a requirement in which my application creates one Spark context in distributed mode and another Spark context in local mode.
When I create these, my complete application works on only one SparkContext (the one created in distributed mode). The second Spark context is not getting