Hi Zhan,

I'm illustrating the issue via a simple example, but it is not difficult
to imagine use cases that need this behaviour. For example, in a job
server such as a web service, you may want to release all of Spark's
resources when it has been idle for longer than an hour. Unless you can
prevent people from stopping the SparkContext, it is reasonable to assume
that they can stop it and start it again at a later time.
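For reference, here is roughly the pattern I have in mind. This is only a
minimal sketch: the "job-server" app name and the idle-timeout trigger are
made up, and constructing the second SQLContext directly is just a
workaround I would expect to sidestep the stale reference:

```
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// First session: serve queries, then release all cluster resources once idle.
val sc = new SparkContext(new SparkConf().setAppName("job-server"))
val sqlContext = SQLContext.getOrCreate(sc)
// ... run queries ...
sc.stop() // e.g. triggered after an hour of inactivity

// A later request needs Spark again, in the same JVM.
val sc2 = new SparkContext(new SparkConf().setAppName("job-server"))
val sqlContext2 = SQLContext.getOrCreate(sc2) // bug: still wraps the stopped sc

// Workaround for now: build the SQLContext explicitly against the new context.
val sqlContext3 = new SQLContext(sc2)
```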

Best Regards,

Jerry


On Mon, Dec 21, 2015 at 7:20 PM, Zhan Zhang <zzh...@hortonworks.com> wrote:

> This looks to me like a very unusual use case. You stop the SparkContext
> and start another one. I don’t think that is well supported. As the
> SparkContext is stopped, all the resources are supposed to be released.
>
> Is there any mandatory reason you have to stop and restart another
> SparkContext?
>
> Thanks.
>
> Zhan Zhang
>
> Note that when sc is stopped, all resources are released (for example, in
> YARN).
> On Dec 20, 2015, at 2:59 PM, Jerry Lam <chiling...@gmail.com> wrote:
>
> > Hi Spark developers,
> >
> > I found that SQLContext.getOrCreate(sc: SparkContext) does not behave
> > correctly when a different SparkContext is provided.
> >
> > ```
> > val sc = new SparkContext
> > val sqlContext = SQLContext.getOrCreate(sc)
> > sc.stop()
> > ...
> >
> > val sc2 = new SparkContext
> > val sqlContext2 = SQLContext.getOrCreate(sc2)
> > sc2.stop()
> > ```
> >
> > The sqlContext2 will reference sc instead of sc2, and therefore the
> > program will not work, because sc has already been stopped.
> >
> > Best Regards,
> >
> > Jerry
>
>
