Some old bits:

http://stackoverflow.com/questions/28162991/cant-run-spark-1-2-in-standalone-mode-on-mac
http://stackoverflow.com/questions/29412157/passing-hostname-to-netty

FYI
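
If I remember right, the gist of those threads is that Netty can only
bind to an address that is actually assigned to a local network
interface, and on EC2 the public IP is NAT'd to the instance rather
than assigned to it, so the bind fails. You can see the same failure
outside Spark; a quick sketch, assuming an OpenBSD-style netcat and
using XX-redacted placeholder addresses:

    $ ip addr show | grep 'inet '
    inet 10.0.XX.XX/24 ... scope global eth0    # only the private IP is on an interface
    $ nc -l 54.210.XX.XX 7077                   # try to listen on the public IP
    nc: Can't assign requested address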

On Wed, Oct 14, 2015 at 7:10 PM, Nicholas Chammas <nicholas.cham...@gmail.com> wrote:

> I’m setting the Spark master address via the SPARK_MASTER_IP environment
> variable in spark-env.sh, like spark-ec2 does
> <https://github.com/amplab/spark-ec2/blob/a990752575cd8b0ab25731d7820a55c714798ec3/templates/root/spark/conf/spark-env.sh#L13>.
>
> The funny thing is that Spark seems to accept this only if the value of
> SPARK_MASTER_IP is a DNS name and not an IP address.
>
> When I provide an IP address, I get errors in the log when starting the
> master:
>
> 15/10/15 01:47:31 ERROR NettyTransport: failed to bind to /54.210.XX.XX:7077, shutting down Netty transport
>
> (XX is my redaction of the full IP address.)
>
> Am I misunderstanding something about how to use this environment variable?
>
> The spark-env.sh template indicates that either an IP address or a
> hostname should work
> <https://github.com/apache/spark/blob/4ace4f8a9c91beb21a0077e12b75637a4560a542/conf/spark-env.sh.template#L49>,
> but my testing shows that only hostnames work.
>
> Nick
>
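
For anyone who finds this later: a minimal spark-env.sh sketch of the
hostname form Nick describes (the hostname below is a hypothetical EC2
public DNS name; inside EC2 such names resolve to the private address,
which is bindable):

    # spark-env.sh
    # Bind by hostname rather than public IP; the name resolves to the
    # instance's private address, which sits on a local interface.
    export SPARK_MASTER_IP=ec2-54-210-XX-XX.compute-1.amazonaws.com

The private IP itself (e.g. export SPARK_MASTER_IP=10.0.XX.XX) should
presumably bind too, since it is assigned to an interface; it's the
NAT'd public IP that Netty can't bind to.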
