Hi Davies,

Thank you for pointing to Spark Streaming. I am confused about how to
return the result after running a function via a thread. I tried using a
Queue to add the results to and print them at the end, but that way I only
see the results after all the threads have finished. How can I get the
result of the function as soon as its thread is finished, rather than
waiting for all the other threads to finish?

Thanks & Regards,
Meethu M
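[Editor's note: one way to print each result as soon as its thread finishes is concurrent.futures.as_completed. A minimal plain-Python sketch follows; the Spark action is simulated with time.sleep, and square_job is an illustrative name, not from the thread above:]

```python
import concurrent.futures
import time

def square_job(n):
    # Stand-in for a per-thread Spark action; in pyspark this
    # could instead call e.g. sc.parallelize(...).count().
    time.sleep(0.05 * n)
    return n * n

results = []
with concurrent.futures.ThreadPoolExecutor(max_workers=3) as pool:
    futures = [pool.submit(square_job, n) for n in range(3)]
    # as_completed() yields each future the moment it finishes,
    # so results appear one by one instead of all at the end.
    for fut in concurrent.futures.as_completed(futures):
        results.append(fut.result())
        print(fut.result())
```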
On Tuesday, 19 May 2015 2:43 AM, Davies Liu <[email protected]> wrote:
SparkContext can be used in multiple threads (Spark streaming works
with multiple threads), for example:
import threading
import time

def show(x):
    time.sleep(1)
    print(x)

def job():
    sc.parallelize(range(100)).foreach(show)

threading.Thread(target=job).start()
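[Editor's note: the same pattern extends to several jobs submitted from separate threads, with a Queue collecting each result. A plain-Python sketch, with the Spark action simulated (the sc call is only indicated in a comment):]

```python
import threading
import queue
import time

results = queue.Queue()  # thread-safe collector for per-job results

def job(n):
    # Placeholder for a per-thread Spark action on a shared
    # SparkContext, e.g. sc.parallelize(range(n)).count().
    time.sleep(0.05)
    results.put((n, n * 10))

threads = [threading.Thread(target=job, args=(i,)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Drain whatever the threads produced, in completion order.
collected = []
while not results.empty():
    collected.append(results.get())
print(collected)
```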
On Mon, May 18, 2015 at 12:34 AM, ayan guha <[email protected]> wrote:
> Hi
>
> So to be clear, do you want to run one operation in multiple threads within
> a function, or do you want to run multiple jobs using multiple threads? I am
> wondering why the Python threading module can't be used. Or have you already
> given it a try?
>
> On 18 May 2015 16:39, "MEETHU MATHEW" <[email protected]> wrote:
>>
>> Hi Akhil,
>>
>> The python wrapper for Spark Job Server did not help me. I actually need
>> the pyspark code sample which shows how I can call a function from 2
>> threads and execute it simultaneously.
>>
>> Thanks & Regards,
>> Meethu M
>>
>>
>>
>> On Thursday, 14 May 2015 12:38 PM, Akhil Das <[email protected]>
>> wrote:
>>
>>
>> Did you happen to have a look at the Spark Job Server? Someone wrote a
>> Python wrapper around it; give it a try.
>>
>> Thanks
>> Best Regards
>>
>> On Thu, May 14, 2015 at 11:10 AM, MEETHU MATHEW <[email protected]>
>> wrote:
>>
>> Hi all,
>>
>> Quote
>> "Inside a given Spark application (SparkContext instance), multiple
>> parallel jobs can run simultaneously if they were submitted from separate
>> threads. "
>>
>> How can I run multiple jobs in one SparkContext using separate threads in
>> pyspark? I found some examples in Scala and Java, but couldn't find Python
>> code. Can anyone help me with a pyspark example?
>>
>> Thanks & Regards,
>> Meethu M
>>
>>
>>
>>
>