Re: Setting up spark to run on two nodes

2016-03-21 Thread Luciano Resende
There is also sbin/start-all.sh and sbin/stop-all.sh, which let you
start/stop the master and workers all together
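
A minimal sketch, assuming Spark sits at the same path on every node and
conf/slaves lists all the worker hosts (run from the Spark home directory on
the master node):

sbin/start-all.sh   # starts a master locally, then one worker on each host in conf/slaves
sbin/stop-all.sh    # stops all the workers, then the master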

On Sunday, March 20, 2016, Akhil Das wrote:

> You can simply execute the sbin/start-slaves.sh script to start up all slave
> processes. Just make sure you have Spark installed at the same path on all
> the machines.
>
> Thanks
> Best Regards
>
> On Sat, Mar 19, 2016 at 4:01 AM, Ashok Kumar wrote:
>
>> Experts.
>>
>> Please give your valued advice.
>>
>> I have Spark 1.5.2 set up as standalone for now, and I have started the
>> master as below:
>>
>> start-master.sh
>>
>> I have also modified the conf/slaves file to have:
>>
>> # A Spark Worker will be started on each of the machines listed below.
>> localhost
>> workerhost
>>
>>
>> On the localhost I start slave as follows:
>>
>> start-slave.sh spark://localhost:7077
>>
>> Questions.
>>
>> If I want a worker process to be started not only on localhost but also
>> on workerhost:
>>
>> 1) Do I just need to run start-slave.sh on localhost, and it will start
>> the worker process on the other node -> workerhost?
>> 2) Do I have to run start-slave.sh spark://workerhost:7077 as well locally
>> on workerhost?
>> 3) On the GUI at http://localhost:4040/environment/ I do not see any
>> reference to a worker process running on workerhost.
>>
>> Appreciate any help on how to go about starting the master on localhost
>> and starting two workers, one on localhost and the other on workerhost.
>>
>> Thanking you
>>
>>
>

-- 
Sent from my Mobile device


Re: Setting up spark to run on two nodes

2016-03-21 Thread Akhil Das
You can simply execute the sbin/start-slaves.sh script to start up all slave
processes. Just make sure you have Spark installed at the same path on all
the machines.
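
A minimal sketch, assuming Spark is installed at the same path on both
machines, conf/slaves lists localhost and workerhost, and the master node can
ssh to each host in that list:

# on the master node, from the Spark home directory
sbin/start-master.sh    # master comes up and listens on port 7077 by default
sbin/start-slaves.sh    # ssh to every host in conf/slaves and start one worker there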

Thanks
Best Regards

On Sat, Mar 19, 2016 at 4:01 AM, Ashok Kumar wrote:

> Experts.
>
> Please give your valued advice.
>
> I have Spark 1.5.2 set up as standalone for now, and I have started the
> master as below:
>
> start-master.sh
>
> I have also modified the conf/slaves file to have:
>
> # A Spark Worker will be started on each of the machines listed below.
> localhost
> workerhost
>
>
> On the localhost I start slave as follows:
>
> start-slave.sh spark://localhost:7077
>
> Questions.
>
> If I want a worker process to be started not only on localhost but also on
> workerhost:
>
> 1) Do I just need to run start-slave.sh on localhost, and it will start the
> worker process on the other node -> workerhost?
> 2) Do I have to run start-slave.sh spark://workerhost:7077 as well locally
> on workerhost?
> 3) On the GUI at http://localhost:4040/environment/ I do not see any
> reference to a worker process running on workerhost.
>
> Appreciate any help on how to go about starting the master on localhost
> and starting two workers, one on localhost and the other on workerhost.
>
> Thanking you
>
>


Setting up spark to run on two nodes

2016-03-19 Thread Ashok Kumar
Experts.
Please give your valued advice.
I have Spark 1.5.2 set up as standalone for now, and I have started the master
as below:
start-master.sh

I have also modified the conf/slaves file to have:

# A Spark Worker will be started on each of the machines listed below.
localhost
workerhost

On the localhost I start the slave as follows:

start-slave.sh spark://localhost:7077

Questions.

If I want a worker process to be started not only on localhost but also on
workerhost:

1) Do I just need to run start-slave.sh on localhost, and it will start the
worker process on the other node -> workerhost?
2) Do I have to run start-slave.sh spark://workerhost:7077 as well locally on
workerhost?
3) On the GUI at http://localhost:4040/environment/ I do not see any reference
to a worker process running on workerhost.
Appreciate any help on how to go about starting the master on localhost and
starting two workers, one on localhost and the other on workerhost.
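
For concreteness, a sketch of the layout being asked about, assuming the
machine called localhost here is reachable from workerhost under its real
hostname (start-slave.sh expects the master's URL as its argument):

# on localhost (the master machine)
start-master.sh
start-slave.sh spark://localhost:7077    # worker 1, local to the master

# on workerhost, run locally there
start-slave.sh spark://localhost:7077    # worker 2; 'localhost' must stand for the
                                         # master's actual, network-reachable hostname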
Thanking you