Can you show the full stack trace?

Which Spark release are you using?
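
If the error is the usual EC2 one (java.net.BindException, typically "Cannot assign requested address"), the likely cause is that an EC2 public IP is provided by AWS through NAT and is never actually assigned to a network interface inside the instance, so the JVM cannot bind a listening socket to it. A rough sketch of the usual workaround, using the placeholder addresses from your message and assuming you launch via sbin/start-master.sh (adjust if you invoke spark-class directly): bind to the private address and advertise the public one with SPARK_PUBLIC_DNS.

  # Bind the master to the address actually configured on the instance's
  # interface (the private IP, or 0.0.0.0) and advertise the public
  # address to workers and the web UI via SPARK_PUBLIC_DNS.
  export SPARK_PUBLIC_DNS=54.xx.xx.xx
  ./sbin/start-master.sh --host 10.0.xx.xx --port 7077

Workers inside the same VPC would then register with spark://10.0.xx.xx:7077.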

Thanks



> On Jul 27, 2015, at 10:07 AM, Wayne Song <wayne.e.s...@gmail.com> wrote:
> 
> Hello,
> 
> I am trying to start a Spark master for a standalone cluster on an EC2 node. 
> The CLI command I'm using looks like this:
> 
> 
> 
> Note that I'm specifying the --host argument; I want my Spark master to
> listen on a specific IP address.  The host that I'm specifying (i.e.
> 54.xx.xx.xx) is the public IP for my EC2 node; I've confirmed that nothing
> else is listening on port 7077 and that my EC2 security group has all ports
> open.  I've also double-checked that the public IP is correct.
> 
> When I use --host 54.xx.xx.xx, I get the following error message:
> 
> 
> 
> This does not occur if I leave out the --host argument, nor does it occur
> if I use --host 10.0.xx.xx, where 10.0.xx.xx is my private EC2 IP address.
> 
> Why would Spark fail to bind to a public EC2 address?
> 
> 
> 
> --
> View this message in context: 
> http://apache-spark-user-list.1001560.n3.nabble.com/Getting-java-net-BindException-when-attempting-to-start-Spark-master-on-EC2-node-with-public-IP-tp24011.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
> 

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
