Passing --host localhost solved the issue, thanks!
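For the archives, the full start command that works for me now (combining the command from my original mail with the flag Jakob suggested):

```
.\spark-class.cmd org.apache.spark.deploy.master.Master --host localhost
```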
Warm regards
Arko
On Mon, Feb 22, 2016 at 5:44 PM, Jakob Odersky wrote:
By default, the Spark master binds to whatever IP address your current
host name resolves to. You have a few options to change that:
- override the IP by setting the environment variable SPARK_LOCAL_IP
- change the IP in your local "hosts" file (/etc/hosts on Linux, not
sure on Windows)
- specify a host explicitly by passing --host when starting the master
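For example, the first and last options look like this in a Linux-style shell (adjust to `set` and `.cmd` syntax on Windows; `127.0.0.1` here is just a placeholder for whichever address you want to bind):

```shell
# Option 1: force the bind address via an environment variable
export SPARK_LOCAL_IP=127.0.0.1

# Option 3: pass the host explicitly when starting the master
# (uncomment to run; requires a Spark installation on the PATH)
# ./spark-class org.apache.spark.deploy.master.Master --host 127.0.0.1

echo "SPARK_LOCAL_IP=$SPARK_LOCAL_IP"
```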
Hello,
I am running Spark on Windows.
I start up the master as follows:
.\spark-class.cmd org.apache.spark.deploy.master.Master
I see that the Spark master doesn't start on 127.0.0.1 but on my
"actual" IP. This is troublesome for me, as I use the address in my
code and need to change it every time my IP changes.