Re: Having multiple spark context

2017-01-30 Thread Rohit Verma
In general, in a single JVM, which is basically running in local mode, you have only one Spark Context

RE: Having multiple spark context

2017-01-30 Thread jasbir.sing
In general, in a single JVM, which is basically running in local mode, you have only one Spark Context

Re: Having multiple spark context

2017-01-30 Thread Mich Talebzadeh
In general, in a single JVM, which is basically running in local mode, you have only one Spark Context. However, you can stop the current Spark Context with sc.stop(). HTH, Dr Mich Talebzadeh. LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
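
A minimal Scala sketch of the stop-and-recreate pattern described above; the cluster master URL spark://master:7077 is a placeholder assumption, not taken from the thread:

    import org.apache.spark.{SparkConf, SparkContext}

    // Only one SparkContext can be active per JVM, so stop the current
    // context before creating one with a different master.
    val sc = new SparkContext(
      new SparkConf().setAppName("local-phase").setMaster("local[*]"))
    // ... local-mode work ...
    sc.stop()  // releases the active context

    // Placeholder master URL; substitute your actual cluster manager.
    val sc2 = new SparkContext(
      new SparkConf().setAppName("cluster-phase").setMaster("spark://master:7077"))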

Re: Having multiple spark context

2017-01-29 Thread vincent gromakowski
A clustering library is necessary to manage multiple JVMs; Akka Cluster, for instance. On Jan 30, 2017, 8:01 AM, "Rohit Verma" wrote: > Hi, > > If I am right, you need to launch the other context from another JVM. If you > are trying to launch another context from the same JVM, it
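
A full Akka Cluster setup is beyond a short snippet; as a hedged illustration of the underlying point (one SparkContext per JVM, so a second context needs its own process), here is a sketch that forks a second JVM. The main class com.example.LocalModeJob is a hypothetical placeholder:

    import scala.sys.process._

    // Fork a separate JVM, reusing this process's classpath.
    // com.example.LocalModeJob is hypothetical: its main() would
    // construct its own local-mode SparkContext.
    val exitCode = Seq(
      "java",
      "-cp", sys.props("java.class.path"),
      "com.example.LocalModeJob"
    ).!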

Re: Having multiple spark context

2017-01-29 Thread Rohit Verma
Hi, If I am right, you need to launch the other context from another JVM. If you are trying to launch another context from the same JVM, it will return you the existing context. Rohit. On Jan 30, 2017, at 12:24 PM, Mark Hamstra wrote: More than
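
A small Scala sketch of the behaviour Rohit describes, assuming the SparkContext.getOrCreate entry point (available since Spark 1.4):

    import org.apache.spark.{SparkConf, SparkContext}

    // Within one JVM, getOrCreate hands back the already-active
    // context instead of constructing a second one.
    val first = SparkContext.getOrCreate(
      new SparkConf().setAppName("first").setMaster("local[2]"))
    val second = SparkContext.getOrCreate(
      new SparkConf().setAppName("second").setMaster("local[4]"))

    println(first eq second)  // true: the same context instance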

Re: Having multiple spark context

2017-01-29 Thread Mark Hamstra
More than one Spark Context in a single Application is not supported. On Sun, Jan 29, 2017 at 9:08 PM, wrote: > Hi, > > I have a requirement in which my application creates one Spark context in > Distributed mode whereas another Spark context in local mode. > >
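
For illustration, a hedged sketch of what "not supported" looks like in practice on Spark 2.x: constructing a second SparkContext directly in the same JVM throws an exception (the exact message may vary by version):

    import org.apache.spark.{SparkConf, SparkContext}

    val a = new SparkContext(
      new SparkConf().setAppName("a").setMaster("local[*]"))

    // Throws org.apache.spark.SparkException:
    // "Only one SparkContext may be running in this JVM (see SPARK-2243)"
    val b = new SparkContext(
      new SparkConf().setAppName("b").setMaster("local[*]"))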

Having multiple spark context

2017-01-29 Thread jasbir.sing
Hi, I have a requirement in which my application creates one Spark context in distributed mode and another Spark context in local mode. When I create these, my complete application works on only one SparkContext (the one created in distributed mode); the second Spark context is not getting created.