Hi all,

The Spark docs say: "Inside a given Spark application (SparkContext instance), multiple parallel jobs can run simultaneously if they were submitted from separate threads."

How can I run multiple jobs in one SparkContext using separate threads in PySpark? I found some examples in Scala and Java, but couldn't find any Python code. Can anyone help me with a PySpark example?

Thanks & Regards,
Meethu M