What about start-all.sh or start-slaves.sh?
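
For reference, here is a minimal sketch of the cluster-scripts route. It assumes passwordless SSH from the master to each worker and a conf/slaves file on the master listing one worker hostname per line (the hostnames below are just placeholders):

# conf/slaves (one worker host per line)
worker1.example.com
worker2.example.com

# from the master node, start a Worker on every host in conf/slaves:
./sbin/start-slaves.sh

# or launch the master and all the workers in one step:
./sbin/start-all.sh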

Thanks
Best Regards

On Tue, Oct 21, 2014 at 10:25 AM, Soumya Simanta <soumya.sima...@gmail.com>
wrote:

> I'm working on a cluster where I need to start the workers separately and
> connect them to a master.
>
> I'm following the instructions here and using branch-1.1
>
> http://spark.apache.org/docs/latest/spark-standalone.html#starting-a-cluster-manually
>
> and I can start the master using
> ./sbin/start-master.sh
>
> When I try to start the slave/worker using
> ./sbin/start-slave.sh, it doesn't work. The logs say that it needs the
> master.
> When I provide
> ./sbin/start-slave.sh spark://<master-ip>:7077, it still doesn't work.
>
> I can start the worker using the following command (as described in the
> documentation).
>
> ./bin/spark-class org.apache.spark.deploy.worker.Worker spark://IP:PORT
>
> I was wondering why start-slave.sh isn't working.
>
> Thanks
> -Soumya
>
>
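
PS: on why start-slave.sh by itself fails — if I remember right, in branch-1.1 that script expects a worker instance number as its first argument, before the master URL, so the invocation would look something like:

./sbin/start-slave.sh 1 spark://<master-ip>:7077

I'm going from memory here, so it's worth double-checking against the usage comment at the top of sbin/start-slave.sh.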
