Hi Akhil,

The Python wrapper for Spark Job Server did not help me. What I actually need is a PySpark code sample showing how to call a function from two threads and have them execute simultaneously.

Thanks & Regards,
Meethu M


     On Thursday, 14 May 2015 12:38 PM, Akhil Das <ak...@sigmoidanalytics.com> 
wrote:
   

Did you happen to have a look at the Spark Job Server? Someone wrote a Python wrapper around it; give it a try.

Thanks
Best Regards
On Thu, May 14, 2015 at 11:10 AM, MEETHU MATHEW <meethu2...@yahoo.co.in> wrote:

Hi all,
Quoting the docs: "Inside a given Spark application (SparkContext instance), multiple parallel jobs can run simultaneously if they were submitted from separate threads."

How can I run multiple jobs in one SparkContext using separate threads in PySpark? I found some examples in Scala and Java, but couldn't find Python code. Can anyone help me with a PySpark example?
Thanks & Regards,
Meethu M



  
