Sorry, I should just do this: ./start-slave.sh spark://x.x.x.x:7077,y.y.y.y:7077,z.z.z.z:7077
But what about export SPARK_MASTER_HOST="x.x.x.x y.y.y.y z.z.z.z"? Don't I need to have that on my worker node? Thanks!

On Fri, Feb 3, 2017 at 4:57 PM, kant kodali <kanth...@gmail.com> wrote:

> Hi,
>
> How do I start a slave? Just run the start-slave.sh script? But then I don't
> understand the following.
>
> I put the following in spark-env.sh on the worker machine:
>
> export SPARK_MASTER_HOST="x.x.x.x y.y.y.y z.z.z.z"
>
> but start-slave.sh doesn't seem to take the SPARK_MASTER_HOST env variable,
> so I did the following:
>
> ./start-slave.sh spark://x.x.x.x:7077 spark://y.y.y.y:7077
> spark://z.z.z.z:7077
>
> This didn't quite work either. Any ideas?
>
> Thanks!
>
>
> On Wed, Jan 25, 2017 at 7:12 PM, Raghavendra Pandey <
> raghavendra.pan...@gmail.com> wrote:
>
>> When you start a slave, you pass the address of the master as a parameter.
>> That slave will contact the master and register itself.
>>
>> On Jan 25, 2017 4:12 AM, "kant kodali" <kanth...@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> How do I dynamically add nodes to a Spark standalone cluster and be able
>>> to discover them? Does Zookeeper do service discovery? What is the standard
>>> tool for these things?
>>>
>>> Thanks,
>>> kant
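
For anyone else hitting this, here is a sketch of what a multi-master (HA) standalone setup typically looks like. Note that SPARK_MASTER_HOST belongs on the masters (one host each), not on the worker; the worker only gets the master list as a single comma-separated URL. The ZooKeeper addresses (port 2181 on the same three hosts) are an assumption for illustration:

```shell
# On EACH master: enable ZooKeeper-based recovery in conf/spark-env.sh.
# SPARK_MASTER_HOST names only that master's own bind address.
export SPARK_MASTER_HOST="x.x.x.x"   # use y.y.y.y / z.z.z.z on the other masters
export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
  -Dspark.deploy.zookeeper.url=x.x.x.x:2181,y.y.y.y:2181,z.z.z.z:2181 \
  -Dspark.deploy.zookeeper.dir=/spark"

# On the worker: no SPARK_MASTER_HOST needed. Pass all masters as ONE
# comma-separated spark:// URL (a single argument), so the worker can
# register with the live master and fail over if it dies.
./sbin/start-slave.sh spark://x.x.x.x:7077,y.y.y.y:7077,z.z.z.z:7077
```

With recoveryMode=ZOOKEEPER, the masters elect a leader through ZooKeeper, and workers registered with the comma-separated URL reconnect to whichever master becomes active, which is the "discovery" being asked about in the original message.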