Yes, both of those work. Take a look at 
http://spark.apache.org/docs/latest/job-scheduling.html, which covers 
scheduling across applications on a cluster as well as scheduling concurrent jobs within the same driver.
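
For the within-driver case, "parallel jobs" just means calling actions from multiple threads on the same SparkContext. Here is a minimal sketch (the app name, data sizes, and job bodies are placeholders) that turns on the FAIR scheduler mode described on that page and submits two jobs concurrently:

import org.apache.spark.{SparkConf, SparkContext}

object ConcurrentJobsSketch {
  def main(args: Array[String]): Unit = {
    // FAIR mode lets jobs submitted at the same time share executors
    // instead of queuing strictly first-in, first-out.
    val conf = new SparkConf()
      .setAppName("concurrent-jobs-sketch") // placeholder name
      .set("spark.scheduler.mode", "FAIR")
    val sc = new SparkContext(conf)

    // Each thread submits an independent job (an action) on the shared SparkContext.
    val t1 = new Thread(new Runnable {
      def run(): Unit = {
        val total = sc.parallelize(1 to 1000000).map(_ * 2).sum()
        println(s"Job 1 sum: $total")
      }
    })
    val t2 = new Thread(new Runnable {
      def run(): Unit = {
        val evens = sc.parallelize(1 to 1000000).filter(_ % 2 == 0).count()
        println(s"Job 2 count: $evens")
      }
    })

    t1.start(); t2.start()
    t1.join(); t2.join()
    sc.stop()
  }
}

The scheduler is thread-safe, so the two actions run at the same time and the cluster's resources are divided between them according to the scheduling mode.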

Matei

On Apr 15, 2014, at 4:08 PM, Ian Ferreira <ianferre...@hotmail.com> wrote:

> What is the support for multi-tenancy in Spark?
> 
> I assume more than one driver can share the same cluster, but can a driver 
> run two jobs in parallel?
> 
