There are also sbin/start-all.sh and sbin/stop-all.sh, which let you
start/stop the master and all workers together.
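
For example, a minimal sketch assuming the master node is localhost, password-less
SSH is set up from localhost to workerhost, and Spark is installed at the same
path on both machines:

    # conf/slaves -- one worker host per line
    localhost
    workerhost

    # run these on the master node (localhost)
    sbin/start-all.sh   # starts the master here plus one worker per host in conf/slaves
    sbin/stop-all.sh    # stops the master and all the listed workers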

On Sunday, March 20, 2016, Akhil Das <ak...@sigmoidanalytics.com> wrote:

> You can simply execute the sbin/start-slaves.sh script to start up all slave
> processes. Just make sure you have Spark installed at the same path on all
> the machines.
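>
> For example (a rough sketch, assuming SPARK_HOME points to the same Spark
> install path on every machine and password-less SSH is configured from the
> master to each worker host):
>
>     # run on the master node, once the master is up
>     $SPARK_HOME/sbin/start-slaves.sh   # reads conf/slaves and starts a worker on each listed host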
>
> Thanks
> Best Regards
>
> On Sat, Mar 19, 2016 at 4:01 AM, Ashok Kumar <ashok34...@yahoo.com.invalid>
> wrote:
>
>> Experts.
>>
>> Please share your valued advice.
>>
>> I have Spark 1.5.2 set up as standalone for now, and I have started the
>> master as below:
>>
>> start-master.sh
>>
>> I have also modified the conf/slaves file to have:
>>
>> # A Spark Worker will be started on each of the machines listed below.
>> localhost
>> workerhost
>>
>>
>> On localhost I start the slave as follows:
>>
>> start-slave.sh spark://localhost:7077
>>
>> Questions.
>>
>> If I want worker processes to be started not only on localhost but also on
>> workerhost:
>>
>> 1) Do I just need to run start-slave.sh on localhost, and will it start the
>> worker process on the other node (workerhost)?
>> 2) Do I have to run start-slave.sh spark://workerhost:7077 locally on
>> workerhost as well?
>> 3) On the GUI at http://localhost:4040/environment/ I do not see any
>> reference to a worker process running on workerhost.
>>
>> I would appreciate any help on how to go about starting the master on
>> localhost and starting two workers, one on localhost and the other on
>> workerhost.
>>
>> Thank you
>>
>>
>

-- 
Sent from my Mobile device
