Did you try setting the SPARK_LOCAL_IP in the conf/spark-env.sh file on
each node?
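
As a sketch, something like this in conf/spark-env.sh on every worker node may do it. The hostname `node1` here is just an example, and `SPARK_LOCAL_HOSTNAME` (which sets the advertised hostname rather than the bind address) is likely closer to what you want for the worker ID, assuming the name resolves on all machines in the cluster:

```shell
# conf/spark-env.sh -- sourced by the Spark daemons on this node.
# "node1" is a placeholder; use this machine's own resolvable hostname.

# Bind address for Spark daemons on this machine (IP or hostname):
export SPARK_LOCAL_IP=node1

# Hostname this node advertises to the cluster; on Spark versions that
# support it, this is what shows up in identifiers instead of the IP:
export SPARK_LOCAL_HOSTNAME=node1
```

After editing the file on each node, restart the workers so they re-register with the new setting.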

Thanks
Best Regards

On Fri, Oct 2, 2015 at 4:18 AM, markluk <m...@juicero.com> wrote:

> I'm running a standalone Spark cluster of 1 master and 2 slaves.
>
> My conf/slaves file lists the fully qualified domain names of the 2
> slave machines.
>
> When I look at the Spark web UI (on :8080), I see my 2 workers, but the
> worker ID uses the IP address, like
> worker-20151001153012-172.31.51.158-44699
>
> <http://apache-spark-user-list.1001560.n3.nabble.com/file/n24905/Screen_Shot_2015-10-01_at_3.png>
>
> That worker ID is not very human friendly. Is there a way to use the
> machine name in the ID instead, e.g.
> worker-20151001153012-node1-44699?
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-cluster-use-machine-name-in-WorkerID-not-IP-address-tp24905.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
> For additional commands, e-mail: user-h...@spark.apache.org
>
>
