On 8 Jul 2016 2:03 p.m., "Mazen" <mazen.ezzedd...@gmail.com> wrote:
>
> Does Spark handle simultaneous execution of jobs within an application?

Yes. Run as many Spark jobs as you want and Spark will queue them based on
the CPU and RAM available to you in the cluster.

> job execution is blocking, i.e. a new job cannot be initiated until the
> previous one commits.

That's how people usually code their apps. A more advanced approach is to
use the SparkContext from multiple threads and execute actions (each action
submits its own job).
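
For example, a minimal sketch of that approach (my own illustration, not
from the original thread; the app name and numbers are made up), using
Scala Futures to call actions on a shared SparkContext from several threads:

import org.apache.spark.{SparkConf, SparkContext}
import scala.concurrent.{Await, Future}
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._

object ConcurrentJobsDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("concurrent-jobs-demo"))

    // Each Future runs an action on its own thread; each action submits a
    // separate Spark job, and the scheduler runs them with whatever CPU and
    // RAM the cluster has available (FIFO by default, or the fair scheduler).
    val jobs = (1 to 3).map { i =>
      Future {
        sc.parallelize(1 to 1000000).map(_ * i).sum()
      }
    }

    val results = Await.result(Future.sequence(jobs), 10.minutes)
    println(results)
    sc.stop()
  }
}
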

>  What does it mean that :  "Spark’s scheduler is fully thread-safe"

You can use a single SparkContext from multiple threads.

Jacek
