Hi Patrick,

I am getting the following warning while running the second user's job:

    WARN component.AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
java.net.BindException: Address already in use
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:344)
    at sun.nio.ch.Net.bind(Net.java:336)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:199)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
    at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
    at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
    at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
    at org.eclipse.jetty.server.Server.doStart(Server.java:286)
    at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
    at org.apache.spark.ui.JettyUtils$$anonfun$connect$1$1.apply(JettyUtils.scala:118)
    at org.apache.spark.ui.JettyUtils$$anonfun$connect$1$1.apply(JettyUtils.scala:118)
    at scala.util.Try$.apply(Try.scala:160)
    at org.apache.spark.ui.JettyUtils$.connect$1(JettyUtils.scala:118)
    at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:129)
    at org.apache.spark.ui.SparkUI.bind(SparkUI.scala:57)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:123)
    at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:548)
    at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:84)
    at main.scala.edu.am.bigdata.spark.twitter.TwitterServices$.main(TwitterServices.scala:53)
    at main.scala.edu.am.bigdata.spark.twitter.TwitterServices.main(TwitterServices.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:601)
    at sbt.Run.invokeMain(Run.scala:68)
    at sbt.Run.run0(Run.scala:61)
    at sbt.Run.execute$1(Run.scala:50)
    at sbt.Run$$anonfun$run$1.apply$mcV$sp(Run.scala:54)
    at sbt.TrapExit$.executeMain$1(TrapExit.scala:33)
    at sbt.TrapExit$$anon$1.run(TrapExit.scala:42)
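
From the Spark 0.8 UI code it looks like Jetty retries on the next port
after this warning, so the second driver may still come up. To avoid the
clash entirely, I could give each user's driver its own UI port before the
StreamingContext is created. A minimal sketch (the port number and master
URL are placeholders; each user should get a distinct free port):

    import org.apache.spark.streaming.{Seconds, StreamingContext}

    // Two drivers on one host both default to UI port 4040, hence the
    // BindException above. Point each driver at its own port before the
    // context (and with it the UI) is created.
    System.setProperty("spark.ui.port", "4041")  // placeholder; pick a free port per user

    // then build the context exactly as before, e.g.
    val ssc = new StreamingContext("local[2]", "TwitterServices", Seconds(1))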

On Sat, Nov 23, 2013 at 9:08 AM, prabeesh k <prabsma...@gmail.com> wrote:

> I am running a streaming job under different users with spark-0.8.0. The
> output is published into a queue as JSON using spray-json. After some
> hours, I noticed that only the first user still gets output; for the other
> users the JSON string contains only the keys. The values in the JSON are
> the output of the Spark Streaming job.
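> For context, the publish step is roughly the following (a sketch with
> stand-in names; only the spray-json calls are the real API):
>
>     import spray.json._
>     import DefaultJsonProtocol._
>
>     // One record per user; the value is that user's streaming output.
>     // userName and resultValue stand in for the job's real variables.
>     val userName = "user1"
>     val resultValue = "42"
>     val json = Map("user" -> userName, "result" -> resultValue).toJson.compactPrint
>     // json is then handed to the queue client (not shown here)
>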
> My additional Spark Streaming settings are:
>
>     System.setProperty("spark.driver.port", "0")
>     System.setProperty("spark.cores.max", "1")
>     System.setProperty("spark.mesos.coarse", "true")
>     System.setProperty("spark.executor.memory", "2g")
>     System.setProperty("spark.scheduler.mode", "FAIR")
>
> Are these settings correct for a long-running, multi-user Spark Streaming
> job? If I am doing something wrong, please suggest a fix.
>
> Thanks,
>          prabeesh
>
>
>
>
> On Sat, Nov 23, 2013 at 5:26 AM, Patrick Wendell <pwend...@gmail.com> wrote:
>
>> Spark has good support for multi-tenancy. Prabeesh, could you describe
>> in more detail what you are actually running, what the expected
>> behavior is, and what you are observing?
>>
>> On Fri, Nov 22, 2013 at 3:09 AM, Sam Bessalah <samkil...@gmail.com>
>> wrote:
>> > Spark doesn't support multiple users, as far as I can tell.
>> >
>> > Sam Bessalah
>> >
>> >> On Nov 22, 2013, at 7:54 AM, prabeesh k <prabsma...@gmail.com> wrote:
>> >>
>> >> Hi all,
>> >>        While running a Spark job using different users, getting only
>> output of first user. After some hours  no output for other users . Please
>> help me , please suggest the settings for multi-user.
>> >>
>> >>   Thanks in advance.
>> >> Regards,
>> >>        Prabeesh
>>
>
>
