Hi, I recently tried deploying Spark master and slave instances to container-based environments such as Docker, Nomad, etc. I've found two issues with how the startup scripts work. sbin/start-master.sh and sbin/start-slave.sh start a daemon by default, but this isn't as compatible with container deployments as one would think. The first issue is that the daemon runs in the background, and some container solutions require the application to run in the foreground; otherwise they consider it not to be running and may terminate the task. The second issue is that the logs don't get integrated with the container solution's logging mechanism, since they are written to files rather than to stdout/stderr. What is the possibility of adding flags or startup scripts that support running Spark in the foreground? It would be great if a flag like SPARK_NO_DAEMONIZE could be added, or another script provided for foreground execution.
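In the meantime, a common workaround is a container entrypoint that starts the daemon and then tails its log file in the foreground, so the container runtime sees a running process and captures the logs on stdout. This is only a sketch under assumptions: the Spark install path (/opt/spark), the log file naming, and the instance-number suffix may differ in your setup.

```shell
#!/usr/bin/env bash
# Hypothetical entrypoint sketch: keep a Spark master "in the foreground"
# for Docker/Nomad by tailing its log. Paths and log names are assumptions.
set -e

SPARK_HOME="${SPARK_HOME:-/opt/spark}"

# Start the master as a background daemon (the default behavior today).
"${SPARK_HOME}/sbin/start-master.sh"

# The daemon writes its log under $SPARK_HOME/logs by default; pick the
# newest master log file (the exact name includes user/host and may vary).
LOG_FILE="$(ls -t "${SPARK_HOME}"/logs/*Master*.out 2>/dev/null | head -n 1)"

# Forward SIGTERM from the container runtime to a clean shutdown.
trap '"${SPARK_HOME}/sbin/stop-master.sh"; exit 0' TERM INT

# Tail the log in the foreground so the container stays alive and the
# log lines reach stdout, where the container's log driver collects them.
exec tail -n +1 -F "${LOG_FILE}"
```

This doesn't fully solve the problem (if the daemon dies, tail keeps running unless you also monitor the PID), which is why a real foreground mode such as a SPARK_NO_DAEMONIZE flag would be cleaner.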
Regards, Jeff

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Running-Spark-master-slave-instances-in-non-Daemon-mode-tp27819.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.