Re: Simultaneous spark Jobs execution.

2016-07-08 Thread Mich Talebzadeh
If you are running in local mode, then you can submit many jobs. As long as
your hardware has the resources to run multiple jobs, there won't be any
dependency between them; in other words, each app (spark-submit) will run in
its own JVM, unaware of the others. Local mode is good for testing.
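
For example (illustrative only; the class and jar names below are made up),
two independent applications can be launched on the same machine in local
mode:

  spark-submit --master local[2] --class com.example.JobA job-a.jar
  spark-submit --master local[2] --class com.example.JobB job-b.jar

Each invocation starts its own driver JVM with its own SparkContext, so the
two applications are unaware of each other and only compete for the machine's
CPU and memory.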

HTH



Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 8 July 2016 at 14:09, Jacek Laskowski wrote:

> On 8 Jul 2016 2:03 p.m., "Mazen" wrote:
> >
> > Does Spark handle simultaneous execution of jobs within an application?
>
> Yes. Run as many Spark jobs as you want and Spark will queue them, given
> the CPU and RAM available to you in the cluster.
>
> > Job execution is blocking, i.e. a new job cannot be initiated until the
> > previous one commits.
>
> That's how people usually code their apps. The more advanced approach is
> to use SparkContext from multiple threads and execute actions (that will
> submit jobs).
>
> > What does it mean that "Spark’s scheduler is fully thread-safe"?
>
> You can use a single SparkContext from multiple threads.
>
> Jacek
>


Re: Simultaneous spark Jobs execution.

2016-07-08 Thread Jacek Laskowski
On 8 Jul 2016 2:03 p.m., "Mazen" wrote:
>
> Does Spark handle simultaneous execution of jobs within an application?

Yes. Run as many Spark jobs as you want and Spark will queue them, given the
CPU and RAM available to you in the cluster.

> Job execution is blocking, i.e. a new job cannot be initiated until the
> previous one commits.

That's how people usually code their apps. The more advanced approach is to
use SparkContext from multiple threads and execute actions (that will
submit jobs).

> What does it mean that "Spark’s scheduler is fully thread-safe"?

You can use a single SparkContext from multiple threads.
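
For example, a minimal sketch of that multi-threaded approach (illustrative
only, assuming Scala, a local[4] master, and trivially small jobs; the object
and app names are made up):

  import org.apache.spark.{SparkConf, SparkContext}
  import scala.concurrent.{Await, Future}
  import scala.concurrent.ExecutionContext.Implicits.global
  import scala.concurrent.duration._

  object ConcurrentJobs extends App {
    // One SparkContext shared by all threads; the scheduler is thread-safe.
    val sc = new SparkContext(
      new SparkConf().setAppName("concurrent-jobs").setMaster("local[4]"))

    // Each Future runs on its own thread and calls an action,
    // so both jobs are submitted to the scheduler concurrently.
    val jobA = Future { sc.parallelize(1 to 1000000).sum() }
    val jobB = Future { sc.parallelize(1 to 1000000).filter(_ % 2 == 0).count() }

    println(Await.result(jobA, 10.minutes))
    println(Await.result(jobB, 10.minutes))

    sc.stop()
  }

How the concurrent jobs then share the executors is governed by the scheduling
mode (FIFO by default).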

Jacek