scala: java.net.BindException?

2014-10-16 Thread ll
Hello... does anyone know how to resolve this issue?  I'm running this
locally on my computer and keep getting this BindException.  Much appreciated.

14/10/16 17:48:13 WARN component.AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:444)
    at sun.nio.ch.Net.bind(Net.java:436)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
    at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
    at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
    at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
    at org.eclipse.jetty.server.Server.doStart(Server.java:293)
    at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
    at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:192)
    at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:202)
    at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:202)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
    at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:202)
    at org.apache.spark.ui.WebUI.bind(WebUI.scala:102)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:224)
    at nn.SimpleNeuralNetwork$delayedInit$body.apply(SimpleNeuralNetwork.scala:15)
    at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
    at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
    at scala.App$$anonfun$main$1.apply(App.scala:71)
    at scala.App$$anonfun$main$1.apply(App.scala:71)
    at scala.collection.immutable.List.foreach(List.scala:318)
    at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
    at scala.App$class.main(App.scala:71)
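
(For reference, a minimal, hypothetical sketch of the usual workaround when the default UI port 4040 is already taken: point spark.ui.port at a free port, or at "0" to let Spark pick one. The object name, app name, and job body below are made up.)

    import org.apache.spark.{SparkConf, SparkContext}

    object UiPortWorkaround {                       // hypothetical object name
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setMaster("local[*]")
          .setAppName("SimpleNeuralNetwork")
          .set("spark.ui.port", "4041")             // or "0" to let Spark pick a free port
        val sc = new SparkContext(conf)
        try {
          // ... application code ...
        } finally {
          sc.stop()                                 // release the UI port when the job ends
        }
      }
    }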






Re: scala: java.net.BindException?

2014-10-16 Thread Duy Huynh
Thanks Marcelo.  I only instantiate the SparkContext once, at the beginning
of this code, and the exception is thrown right at that point.

I also tried running other programs that previously worked fine, and they
now fail with the same error.

It looks as if something has put a global block on creating a SparkContext,
preventing any program from creating one.
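
(One thing worth ruling out, just a guess: a process from an earlier run that never reached sc.stop() may still be holding port 4040, so every new SparkContext logs the same warning while its UI hunts for a free port. A small hypothetical check, runnable in the Scala REPL, to see whether 4040 is actually free:)

    import java.net.ServerSocket
    import scala.util.Try

    // Try to bind the Spark UI's default port; if this fails, some other
    // process on this machine is still holding it.
    def portIsFree(port: Int): Boolean =
      Try { new ServerSocket(port).close(); true }.getOrElse(false)

    println(s"port 4040 free? ${portIsFree(4040)}")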



On Oct 16, 2014 6:26 PM, Marcelo Vanzin van...@cloudera.com wrote:

 This error is not fatal, since Spark will retry on a different port,
 but it might be a problem, for different reasons, if your code is
 somehow trying to instantiate multiple SparkContexts.

 I assume nn.SimpleNeuralNetwork is part of your application, and
 since it seems to be instantiating a new SparkContext and is also
 being called from an iteration, that looks a bit fishy.
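
A minimal sketch of that pattern (all names, paths, and the training step are hypothetical): a single SparkContext for the whole program, created outside any iteration and stopped at the end, so the web UI port is bound exactly once per run.

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.rdd.RDD

    object SimpleNeuralNetworkApp {                 // hypothetical object name
      def main(args: Array[String]): Unit = {
        // One SparkContext per program, created before the loop starts.
        val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("nn"))
        try {
          val data: RDD[String] = sc.textFile("data/train.txt")   // hypothetical input
          for (epoch <- 1 to 10) trainOneEpoch(sc, data)          // hypothetical step
        } finally {
          sc.stop()   // frees the UI port for the next run
        }
      }

      // Placeholder so the sketch compiles; real training logic would go here.
      def trainOneEpoch(sc: SparkContext, data: RDD[String]): Unit = ()
    }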

 On Thu, Oct 16, 2014 at 2:51 PM, ll duy.huynh@gmail.com wrote:
  [original message and stack trace quoted above; trimmed]



 --
 Marcelo