Haoming,

If the Spark UI shows that one of the jobs is stuck in the WAITING state, this is a resource issue: the waiting application cannot get the cores or memory it asked for because another application already holds them. You will need to set properties such as:

spark.executor.memory
spark.cores.max

Set these so that each application only takes a portion of the available worker memory and cores, leaving room for the others to be scheduled.
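As a rough sketch (the application name and the specific values below are only illustrative, not taken from your setup), the caps can be set on the SparkConf before the SparkContext is created:

    import org.apache.spark.{SparkConf, SparkContext}

    // Illustrative limits: cap this application at 2 cores in total and 1 GB
    // per executor, so that a second application submitted to the same
    // standalone cluster still has resources left and does not sit in WAITING.
    val conf = new SparkConf()
      .setAppName("shared-cluster-app")
      .set("spark.cores.max", "2")        // max total cores for this application
      .set("spark.executor.memory", "1g") // memory per executor

    val sc = new SparkContext(conf)

The same values can also be passed when submitting, e.g. --conf spark.cores.max=2 --conf spark.executor.memory=1g.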
Regards,
Mike