Re: Setting up spark to run on two nodes

2016-03-21 Thread Luciano Resende
There is also sbin/start-all.sh and sbin/stop-all.sh, which let you start/stop the master and workers all together.
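For example, a minimal sketch assuming Spark is installed at the same $SPARK_HOME on every node and conf/slaves lists the worker hosts:

    # On the master node:
    $SPARK_HOME/sbin/start-all.sh   # starts the master here, then a worker on each host in conf/slaves
    $SPARK_HOME/sbin/stop-all.sh    # stops the workers first, then the master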

Re: Setting up spark to run on two nodes

2016-03-21 Thread Akhil Das
You can simply execute the sbin/start-slaves.sh file to start up all slave processes. Just make sure you have Spark installed at the same path on all the machines. Thanks, Best Regards
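For instance (a sketch only; the hostname workerhost is illustrative, and passwordless ssh from the master to each worker is assumed):

    # Run from the master; it ssh'es into each host listed in conf/slaves.
    $SPARK_HOME/sbin/start-slaves.sh
    # Quick check that Spark sits at the same local path on a worker
    # ($SPARK_HOME expands on the master, then the path is tested remotely):
    ssh workerhost "ls $SPARK_HOME/sbin/start-slave.sh"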

Setting up spark to run on two nodes

2016-03-19 Thread Ashok Kumar
Experts, your valued advice please. I have Spark 1.5.2 set up as standalone for now, and I have started the master as below:

    start-master.sh

I have also modified the conf/slaves file to have:

    # A Spark Worker will be started on each of the machines listed below.
    localhost
    workerhost

On the localhost
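For reference, the usual two-node standalone flow looks like this (hostnames are illustrative; the master URL and web UI ports shown are Spark's defaults):

    # 1. On the master host:
    $SPARK_HOME/sbin/start-master.sh
    #    The log prints the master URL, e.g. spark://<master-host>:7077

    # 2. conf/slaves on the master, one worker hostname per line:
    #      localhost
    #      workerhost

    # 3. Still on the master, start every worker listed in conf/slaves:
    $SPARK_HOME/sbin/start-slaves.sh

    # 4. Open the master web UI (http://<master-host>:8080 by default)
    #    and confirm both workers show as ALIVE.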