If you run the main driver and the other Spark jobs in client mode, you can be
sure that all of the drivers run on the same node. Of course, all of those
drivers then consume the resources of that single node.

If you run the main driver in client mode but run the other Spark jobs in
cluster mode, the drivers of those Spark jobs will be launched on other nodes
in the cluster. That should work too; it is the same as running one Spark app
in client mode and several others in cluster mode.

If you run your main driver in cluster mode and run the other Spark jobs in
cluster mode too, you may need Spark properly installed on all nodes in the
cluster, because those Spark jobs will be launched from whichever node the
main driver happens to be running on.
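
The deploy mode of each child job is chosen when the main driver submits it.
Here is a minimal sketch, assuming the child jobs are launched through the
org.apache.spark.launcher.SparkLauncher API; the jar path and main class below
are just placeholders:

    import org.apache.spark.launcher.SparkLauncher

    object LaunchChildJobs {
      def main(args: Array[String]): Unit = {
        // "cluster" puts the child job's driver on some cluster node;
        // "client" keeps it on the node running this code (the main driver's node).
        val childJob = new SparkLauncher()
          .setAppResource("/path/to/child-job.jar")   // placeholder jar path
          .setMainClass("com.example.ChildJob")       // placeholder main class
          .setMaster("yarn")
          .setDeployMode("cluster")
          .launch()        // spawns spark-submit on this node, so Spark must be installed here

        childJob.waitFor() // wait for the child job to finish
      }
    }

Note that launch() spawns a local spark-submit process, which is why the node
hosting the main driver needs a Spark installation even when the child jobs
run in cluster mode.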





-----
Liang-Chi Hsieh | @viirya 
Spark Technology Center 
http://www.spark.tc/ 