Re: How can I get the same spark context in two different python processes

2022-12-13 Thread Maciej
… wrote on Monday, December 12, 2022 at 2:39 PM: > Spark Connect :) > (It’s work in progress) > On Mon, Dec 12, 2022 at 2:29 PM, Kevin Su <pings...@gmail.com>

Re: How can I get the same spark context in two different python processes

2022-12-12 Thread Kevin Su
Hi Jack, my use case is a bit different: I created a subprocess instead of a thread, and I can't pass the args to the subprocess. Jack Goodson wrote on Monday, December 12, 2022 at 8:03 PM: > apologies, the code should read as below > from threading import Thread > context =
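For context, a minimal sketch of the limitation Kevin is describing, assuming a local PySpark install (the print message is illustrative): Python's multiprocessing passes arguments to a child process by pickling them, and PySpark deliberately refuses to pickle a live SparkContext:

import pickle
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark").getOrCreate()

try:
    # multiprocessing would have to serialize the argument like this,
    # and PySpark raises rather than let a live context be pickled
    pickle.dumps(spark.sparkContext)
except Exception as e:
    print(f"cannot hand the context to a subprocess: {e}")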

Re: How can I get the same spark context in two different python processes

2022-12-12 Thread Jack Goodson
Apologies, the code should read as below:

import pyspark
from threading import Thread

context = pyspark.sql.SparkSession.builder.appName("spark").getOrCreate()

t1 = Thread(target=my_func, args=(context,))
t1.start()
t2 = Thread(target=my_func, args=(context,))
t2.start()

On Tue, Dec 13, 2022 at 4:10 PM Jack
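For completeness, a self-contained version of Jack's sketch; my_func is hypothetical here, added only so the example runs end to end. Both threads receive the same driver-side session object, which works because they live in one Python process:

from threading import Thread
from pyspark.sql import SparkSession

def my_func(session):
    # hypothetical worker: both threads share the one session object
    session.range(5).show()

context = SparkSession.builder.appName("spark").getOrCreate()

threads = [Thread(target=my_func, args=(context,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()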

Re: How can I get the same spark context in two different python processes

2022-12-12 Thread Jack Goodson
Hi Kevin, I had a similar use case (see the code below) but with something that wasn't Spark related. I think the below should work for you; you may need to edit the context variable to suit your needs, but hopefully it gives the general idea of sharing a single object between multiple threads.

Re: How can I get the same spark context in two different python processes

2022-12-12 Thread Kevin Su
… wrote on Monday, December 12, 2022 at 2:39 PM: > Spark Connect :) > (It’s work in progress) > On Mon, Dec 12, 2022 at 2:29 PM, Kevin Su <pings...@gmail.com> wrote:

Re: How can I get the same spark context in two different python processes

2022-12-12 Thread Maciej
…ks.com> wrote on Monday, December 12, 2022 at 2:39 PM: > Spark Connect :) > (It’s work in progress) > On Mon, Dec 12, 2022 at 2:29 PM, Kevin Su <pings...@gmail.com> wrote: > Hey

Re: How can I get the same spark context in two different python processes

2022-12-12 Thread Kevin Su
…or Jira ticket for it? > Reynold Xin wrote on Monday, December 12, 2022 at 2:39 PM: >> Spark Connect :) >> (It’s work in progress) >> On Mon, Dec 12, 2022 at 2:29

Re: How can I get the same spark context in two different python processes

2022-12-12 Thread bo yang
… wrote on Monday, December 12 at 2:42 PM: > Thanks for the quick response! Do we have any PR or Jira ticket for it? > Reynold Xin wrote on Monday, December 12, 2022 at 2:39 PM: >> Spark Connect :) >> (It’s work in progress)

Re: How can I get the same spark context in two different python processes

2022-12-12 Thread Kevin Su
> Thanks for the quick response! Do we have any PR or Jira ticket for it? > Reynold Xin wrote on Monday, December 12, 2022 at 2:39 PM: >> Spark Connect :) >> (It’s work in progress) >> On Mon, Dec 12, 2022 at 2:29 PM, Kevin Su

Re: How can I get the same spark context in two different python processes

2022-12-12 Thread Reynold Xin
Spark Connect :) (It’s work in progress) On Mon, Dec 12, 2022 at 2:29 PM, Kevin Su <pings...@gmail.com> wrote: > Hey there, how can I get the same Spark context in two different Python processes? > Let’s say I create a context in Process A, and then I want to use Python subprocess B to get the Spark context created by Process A.
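For readers who land on this thread later: Spark Connect shipped with Spark 3.4. A minimal sketch of how it covers this use case, assuming a connect server is already running on the default port 15002 (started, e.g., with sbin/start-connect-server.sh); each Python process attaches as a thin client to the same server-side driver instead of owning its own:

# the same code can run in Process A and Process B independently
from pyspark.sql import SparkSession

spark = SparkSession.builder.remote("sc://localhost:15002").getOrCreate()
spark.range(3).show()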

How can I get the same spark context in two different python processes

2022-12-12 Thread Kevin Su
Hey there, how can I get the same Spark context in two different Python processes? Let’s say I create a context in Process A, and then I want to use Python subprocess B to get the Spark context created by Process A. How can I achieve that? I've tried pyspark.sql.SparkSession.builder.appName
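For context, a sketch of why the truncated attempt above cannot work as hoped (the file names are illustrative): getOrCreate() reuses a session only within the current Python process, so the second process silently starts a second driver:

# process_a.py
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark").getOrCreate()

# process_b.py -- launched as a separate interpreter, e.g. via subprocess
from pyspark.sql import SparkSession

# NOT the session from process_a: getOrCreate() only reuses a session
# inside the current process, so this spins up a brand-new JVM driver
spark = SparkSession.builder.appName("spark").getOrCreate()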