Thanks Igor,

I managed to find a fairly simple solution. It turns out that the shell scripts
(e.g. start-master.sh, start-slave.sh) end up executing bin/spark-class,
which always runs in the foreground.

Here is a solution I provided on Stack Overflow:

   -
http://stackoverflow.com/questions/30672648/how-to-autostart-an-apache-spark-cluster-using-supervisord/30676844#30676844
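
The gist of that approach is to have supervisord invoke spark-class directly,
so both daemons stay in the foreground where supervisord can manage them. A
minimal sketch (the install path, host, and port are illustrative, adjust
for your environment):

```ini
; assumes Spark is installed under /opt/spark -- adjust paths as needed
[program:spark-master]
command=/opt/spark/bin/spark-class org.apache.spark.deploy.master.Master --host 127.0.0.1
autostart=true
autorestart=true

[program:spark-worker]
; the worker needs the master URL; spark://<master-host>:7077 is the default
command=/opt/spark/bin/spark-class org.apache.spark.deploy.worker.Worker spark://127.0.0.1:7077
autostart=true
autorestart=true
```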


Cheers, Mike


On Wed, Jun 3, 2015 at 12:29 PM, Igor Berman <igor.ber...@gmail.com> wrote:

> assuming you are talking about a standalone cluster
> imho, with workers you won't get any problems and it's straightforward
> since they are usually foreground processes
> with master it's a bit more complicated: ./sbin/start-master.sh goes into
> the background, which is not good for supervisor, but anyway I think it's
> doable (going to set it up too in a few days)
>
> On 3 June 2015 at 21:46, Mike Trienis <mike.trie...@orcsol.com> wrote:
>
>> Hi All,
>>
>> I am curious to know if anyone has successfully deployed a Spark cluster
>> using supervisord?
>>
>>    - http://supervisord.org/
>>
>> Currently I am using the cluster launch scripts, which are working
>> great; however, every time I reboot my VM or development environment I
>> need to re-launch the cluster.
>>
>> I am considering using supervisord to control all the processes (worker,
>> master, etc.) in order to have the cluster up and running after boot-up,
>> although I'd like to understand whether it will cause more issues than it
>> solves.
>>
>> Thanks, Mike.
>>
>>
>
