Re: Managing spark processes via supervisord

2015-06-05 Thread ayan guha
I use a simple Python script to launch the cluster. I just did it for fun, so
of course it is not the best and a lot of modifications can be done. But I
think you are looking for something similar?

import subprocess as s
from time import sleep

# Path to spark-class.cmd in the local Spark installation
cmd = "D:\\spark\\spark-1.3.1-bin-hadoop2.6\\spark-1.3.1-bin-hadoop2.6\\spark-1.3.1-bin-hadoop2.6\\bin\\spark-class.cmd"

master = "org.apache.spark.deploy.master.Master"
worker = "org.apache.spark.deploy.worker.Worker"
masterUrl = "spark://BigData:7077"
cmds = {"masters": 1, "workers": 3}  # desired process counts (unused below)

masterProcess = [cmd, master]
workerProcess = [cmd, worker, masterUrl]

noWorker = 3

# Start the master first, then give it a moment before the workers connect
pMaster = s.Popen(masterProcess)
sleep(3)

pWorkers = []
for i in range(noWorker):
    pw = s.Popen(workerProcess)
    pWorkers.append(pw)
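The fixed sleep(3) above is only a guess at how long the master needs to come up. A slightly more robust sketch (wait_for_port is an illustrative helper, not part of any Spark API) polls the master's TCP port, e.g. the 7077 from masterUrl, until it accepts connections before the workers are launched:

```python
import socket
import time

def wait_for_port(host, port, timeout=30.0):
    """Poll until a TCP port accepts connections, or give up after timeout."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.5)
    return False

# Demo against a throwaway local listener instead of a real Spark master:
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
_, port = srv.getsockname()
ok = wait_for_port("127.0.0.1", port)
print(ok)  # True: the listener accepted the probe connection
srv.close()
```

In the launcher above, wait_for_port("BigData", 7077) would replace the sleep(3) call.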



On Sat, Jun 6, 2015 at 8:19 AM, Mike Trienis mike.trie...@orcsol.com
wrote:

 Thanks Igor,

 I managed to find a fairly simple solution. It seems that the shell
 scripts (e.g. start-master.sh, start-slave.sh) end up executing
 bin/spark-class, which is always run in the foreground.

 Here is a solution I provided on stackoverflow:

    - http://stackoverflow.com/questions/30672648/how-to-autostart-an-apache-spark-cluster-using-supervisord/30676844#30676844


 Cheers Mike


 On Wed, Jun 3, 2015 at 12:29 PM, Igor Berman igor.ber...@gmail.com
 wrote:

 Assuming you are talking about a standalone cluster: IMHO you won't get
 any problems with the workers, and it's straightforward since they are
 usually foreground processes. With the master it's a bit more complicated:
 ./sbin/start-master.sh goes into the background, which is not good for
 supervisord, but anyway I think it's doable (going to set it up too in a
 few days).

 On 3 June 2015 at 21:46, Mike Trienis mike.trie...@orcsol.com wrote:

 Hi All,

 I am curious to know if anyone has successfully deployed a spark cluster
 using supervisord?

- http://supervisord.org/

 Currently I am using the cluster launch scripts, which are working
 great; however, every time I reboot my VM or development environment I
 need to re-launch the cluster.

 I am considering using supervisord to control all the processes (worker,
 master, etc.) in order to have the cluster up and running after boot-up,
 although I'd like to understand whether it will cause more issues than it
 solves.

 Thanks, Mike.






-- 
Best Regards,
Ayan Guha


Re: Managing spark processes via supervisord

2015-06-05 Thread Mike Trienis
Thanks Igor,

I managed to find a fairly simple solution. It seems that the shell scripts
(e.g. start-master.sh, start-slave.sh) end up executing bin/spark-class,
which is always run in the foreground.

Here is a solution I provided on stackoverflow:

   - http://stackoverflow.com/questions/30672648/how-to-autostart-an-apache-spark-cluster-using-supervisord/30676844#30676844
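The gist of that approach, running bin/spark-class directly so the processes stay in the foreground, can be sketched as a supervisord config like the following (the /opt/spark path, host name, and port here are assumptions for illustration, not taken from the answer):

```ini
[program:spark-master]
command=/opt/spark/bin/spark-class org.apache.spark.deploy.master.Master --host localhost
autostart=true
autorestart=true

[program:spark-worker]
command=/opt/spark/bin/spark-class org.apache.spark.deploy.worker.Worker spark://localhost:7077
autostart=true
autorestart=true
```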


Cheers Mike


On Wed, Jun 3, 2015 at 12:29 PM, Igor Berman igor.ber...@gmail.com wrote:

 Assuming you are talking about a standalone cluster: IMHO you won't get
 any problems with the workers, and it's straightforward since they are
 usually foreground processes. With the master it's a bit more complicated:
 ./sbin/start-master.sh goes into the background, which is not good for
 supervisord, but anyway I think it's doable (going to set it up too in a
 few days).

 On 3 June 2015 at 21:46, Mike Trienis mike.trie...@orcsol.com wrote:

 Hi All,

 I am curious to know if anyone has successfully deployed a spark cluster
 using supervisord?

- http://supervisord.org/

 Currently I am using the cluster launch scripts, which are working
 great; however, every time I reboot my VM or development environment I
 need to re-launch the cluster.

 I am considering using supervisord to control all the processes (worker,
 master, etc.) in order to have the cluster up and running after boot-up,
 although I'd like to understand whether it will cause more issues than it
 solves.

 Thanks, Mike.





Re: Managing spark processes via supervisord

2015-06-03 Thread Igor Berman
Assuming you are talking about a standalone cluster: IMHO you won't get
any problems with the workers, and it's straightforward since they are
usually foreground processes. With the master it's a bit more complicated:
./sbin/start-master.sh goes into the background, which is not good for
supervisord, but anyway I think it's doable (going to set it up too in a
few days).
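The backgrounding problem can be illustrated with a generic sketch (plain subprocess calls, nothing Spark-specific): a launcher that hands its work to a background child exits immediately, so a supervisor watching the launcher's PID sees it die, whereas a foreground process keeps the watched PID alive:

```python
import subprocess
import sys

# A "daemonizing" launcher (like sbin/start-master.sh): it spawns the real
# work as a background child and then exits immediately.
launcher = subprocess.Popen(
    [sys.executable, "-c",
     "import subprocess, sys; "
     "subprocess.Popen([sys.executable, '-c', 'import time; time.sleep(1)'])"])
launcher.wait()
print("launcher exited:", launcher.returncode == 0)

# A foreground process (like bin/spark-class): the PID the supervisor
# spawned is the PID doing the work, so it can be monitored and restarted.
foreground = subprocess.Popen(
    [sys.executable, "-c", "import time; time.sleep(1)"])
print("foreground still running:", foreground.poll() is None)
foreground.wait()
```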

On 3 June 2015 at 21:46, Mike Trienis mike.trie...@orcsol.com wrote:

 Hi All,

 I am curious to know if anyone has successfully deployed a spark cluster
 using supervisord?

- http://supervisord.org/

 Currently I am using the cluster launch scripts, which are working great;
 however, every time I reboot my VM or development environment I need to
 re-launch the cluster.

 I am considering using supervisord to control all the processes (worker,
 master, etc.) in order to have the cluster up and running after boot-up,
 although I'd like to understand whether it will cause more issues than it
 solves.

 Thanks, Mike.