Did you happen to have a look at the Spark Job Server?
<https://github.com/ooyala/spark-jobserver> Someone wrote a Python wrapper
around it <https://github.com/wangqiang8511/spark_job_manager>; give it a
try.
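
If all you need is the threading part, here is a minimal PySpark sketch
(the app name, data, and the run_job helper are made up for illustration;
the point is that any action submitted from a thread becomes its own job
in the shared context):

import threading
from pyspark import SparkConf, SparkContext

# One SparkContext shared by the whole application.
conf = SparkConf().setAppName("multi-job-demo").setMaster("local[4]")
sc = SparkContext(conf=conf)

def run_job(name, data):
    # Each thread submits its own action; the scheduler runs the
    # resulting jobs concurrently inside the one shared context.
    total = sc.parallelize(data).map(lambda x: x * x).sum()
    print("%s: sum of squares = %s" % (name, total))

threads = [threading.Thread(target=run_job,
                            args=("job-%d" % i, range(i * 100, (i + 1) * 100)))
           for i in range(3)]
for t in threads:
    t.start()
for t in threads:
    t.join()

sc.stop()

How much the jobs actually overlap depends on free resources and the
scheduler; setting spark.scheduler.mode to FAIR shares the cluster more
evenly between the threads' jobs than the default FIFO mode.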

Thanks
Best Regards

On Thu, May 14, 2015 at 11:10 AM, MEETHU MATHEW <[email protected]>
wrote:

> Hi all,
>
>  Quote:
>  "Inside a given Spark application (SparkContext instance), multiple
> parallel jobs can run simultaneously if they were submitted from separate
> threads."
>
> How can I run multiple jobs in one SparkContext using separate threads in
> PySpark? I found some examples in Scala and Java, but couldn't find Python
> code. Can anyone help me with a PySpark example?
>
> Thanks & Regards,
> Meethu M
>
