If you are running in local mode, you can submit as many jobs as you like. As
long as your hardware has the resources to run multiple jobs, there is no
dependency between them. In other words, each app (spark-submit) will run in
its own JVM, unaware of the others. Local mode is good for testing.
HTH
Dr Mich Talebzadeh
On 8 Jul 2016 2:03 p.m., "Mazen" wrote:
>
> Does Spark handle simultaneous execution of jobs within an application?
Yes. Run as many Spark jobs as you want, and Spark will queue them given the
CPU and RAM available to you in the cluster.
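To make this concrete, here is a minimal sketch of running several jobs
concurrently within one application by wrapping each action in a Future on a
shared SparkContext. The object name ConcurrentJobs and the toy workloads are
illustrative, not from the original thread; Spark's scheduler (FIFO by
default, FAIR if configured) queues the resulting jobs against the available
cores and memory.

```scala
// Hypothetical sketch: several concurrent Spark jobs in one application.
import org.apache.spark.sql.SparkSession
import scala.concurrent.{Await, Future}
import scala.concurrent.duration.Duration
import scala.concurrent.ExecutionContext.Implicits.global

object ConcurrentJobs {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("concurrent-jobs-demo")
      .master("local[*]")   // local mode, good for testing
      .getOrCreate()
    val sc = spark.sparkContext

    // Each Future triggers an action, i.e. a separate Spark job,
    // on the shared SparkContext; Spark queues and schedules them.
    val jobs = (1 to 3).map { i =>
      Future {
        sc.parallelize(1 to 1000000).map(_ * i).sum()
      }
    }

    // Block until every job finishes and print each result.
    jobs.foreach(f => println(Await.result(f, Duration.Inf)))
    spark.stop()
  }
}
```

Within one application the jobs share the driver and executors; across
applications (separate spark-submit invocations), each driver gets its own
JVM as described above.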
> job execution is blocking i.e. a
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Simultaneous-spark-Jobs-execution-tp27310.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.