I think this is a general multi-threading question; Queue is the
right direction to go.

Have you tried something like this?

import threading
import Queue  # Python 2; on Python 3 this module is called "queue"

results = Queue.Queue()

def run_job(f, args):
    # run the job and push its result onto the shared queue
    r = f(*args)
    results.put(r)

# start multiple threads to run jobs
threading.Thread(target=run_job, args=(f, args)).start()

# results.get() blocks, so each result is printed as soon as its
# thread finishes
while True:
    r = results.get()
    print r
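
For the original question (several Spark jobs on one SparkContext, printing
each result as soon as its thread finishes), the Queue pattern above can be
combined with the threading example quoted below. This is only a minimal
sketch, assuming an existing SparkContext named `sc` (e.g. from a pyspark
shell); the RDDs and the sum() action are just illustrative placeholders:

import threading
import Queue  # Python 2; "queue" on Python 3

results = Queue.Queue()

def run_job(job_id, rdd):
    # each thread submits its own Spark job on the shared SparkContext
    results.put((job_id, rdd.sum()))

# hypothetical example jobs; `sc` must already exist (e.g. pyspark shell)
rdds = [sc.parallelize(range(n)) for n in (100, 1000, 10000)]

threads = [threading.Thread(target=run_job, args=(i, rdd))
           for i, rdd in enumerate(rdds)]
for t in threads:
    t.start()

# results.get() blocks until some job has finished, so each result is
# printed as soon as its thread completes, not after all of them
for _ in threads:
    job_id, total = results.get()
    print job_id, total

for t in threads:
    t.join()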


On Wed, May 20, 2015 at 5:56 AM, MEETHU MATHEW <meethu2...@yahoo.co.in> wrote:
> Hi Davies,
> Thank you for pointing to spark streaming.
> I am confused about how to return the result after running a function via a
> thread.
> I tried using Queue to add the results to it and print them at the end. But
> here, I can see the results only after all threads are finished.
> How can I get the result of a function once its thread is finished, rather
> than waiting for all the other threads to finish?
>
> Thanks & Regards,
> Meethu M
>
>
>
> On Tuesday, 19 May 2015 2:43 AM, Davies Liu <dav...@databricks.com> wrote:
>
>
> SparkContext can be used in multiple threads (Spark streaming works
> with multiple threads), for example:
>
> import threading
> import time
>
> def show(x):
>     time.sleep(1)
>     print x
>
> def job():
>     sc.parallelize(range(100)).foreach(show)
>
> threading.Thread(target=job).start()
>
>
> On Mon, May 18, 2015 at 12:34 AM, ayan guha <guha.a...@gmail.com> wrote:
>> Hi
>>
>> So to be clear, do you want to run one operation in multiple threads
>> within a function, or do you want to run multiple jobs using multiple
>> threads? I am wondering why the Python thread module can't be used. Or
>> have you already given it a try?
>>
>> On 18 May 2015 16:39, "MEETHU MATHEW" <meethu2...@yahoo.co.in> wrote:
>>>
>>> Hi Akhil,
>>>
>>> The python wrapper for Spark Job Server did not help me. I actually need
>>> a pyspark code sample which shows how I can call a function from two
>>> threads and execute it simultaneously.
>>>
>>> Thanks & Regards,
>>> Meethu M
>>>
>>>
>>>
>>> On Thursday, 14 May 2015 12:38 PM, Akhil Das <ak...@sigmoidanalytics.com>
>>> wrote:
>>>
>>>
>>> Did you happen to have a look at the Spark Job Server? Someone wrote a
>>> python wrapper around it; give it a try.
>>>
>>> Thanks
>>> Best Regards
>>>
>>> On Thu, May 14, 2015 at 11:10 AM, MEETHU MATHEW <meethu2...@yahoo.co.in>
>>> wrote:
>>>
>>> Hi all,
>>>
>>>  Quote
>>>  "Inside a given Spark application (SparkContext instance), multiple
>>> parallel jobs can run simultaneously if they were submitted from separate
>>> threads. "
>>>
>>> How can I run multiple jobs in one SparkContext using separate threads in
>>> pyspark? I found some examples in Scala and Java, but couldn't find any
>>> python code. Can anyone help me with a pyspark example?
>>>
>>> Thanks & Regards,
>>> Meethu M
