[jira] [Closed] (SPARK-20181) Avoid noisy Jetty WARN log when failing to bind a port

2017-04-07 Thread Derek Dagit (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-20181?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Derek Dagit closed SPARK-20181.
---
Resolution: Invalid

This is no longer an issue in master because the log level is already set such 
that the message with the stack trace does not appear.
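For anyone on a release where the trace still shows up, the same effect can be had from conf/log4j.properties by raising the thresholds on the shaded Jetty loggers. A minimal sketch, assuming the log4j 1.x configuration that Spark 2.x uses by default (the same two lines appear in Spark's log4j.properties.template):

{noformat}
# Jetty is shaded under org.spark_project in Spark 2.x; keep its logs quiet
log4j.logger.org.spark_project.jetty=WARN
# Drop the WARN-with-stack-trace that AbstractLifeCycle emits on a failed bind
log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
{noformat}

With AbstractLifeCycle at ERROR, Spark's own retry message from Utils.startServiceOnPort ("Service 'SparkUI' could not bind on port 4040. Attempting port 4041.") still appears, so the port-hunting behavior remains visible.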

> Avoid noisy Jetty WARN log when failing to bind a port
> --
>
> Key: SPARK-20181
> URL: https://issues.apache.org/jira/browse/SPARK-20181
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 2.1.0
>Reporter: Derek Dagit
>Priority: Minor
>
> As a user, I would like to suppress the Jetty WARN log about failing to bind 
> to a port already in use, so that my logs are less noisy.
> Currently, Jetty code prints the stack trace of the BindException at WARN 
> level. In the context of starting a service on an ephemeral port, this is not 
> a useful warning, and it is exceedingly verbose.

[jira] [Commented] (SPARK-20181) Avoid noisy Jetty WARN log when failing to bind a port

2017-03-31 Thread Derek Dagit (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-20181?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15951686#comment-15951686 ]

Derek Dagit commented on SPARK-20181:
-

Working on this...

> Avoid noisy Jetty WARN log when failing to bind a port
> --
>
> Key: SPARK-20181
> URL: https://issues.apache.org/jira/browse/SPARK-20181
> Project: Spark
>  Issue Type: Improvement
>  Components: Spark Core
>Affects Versions: 2.1.0
>Reporter: Derek Dagit
>Priority: Minor
>
> As a user, I would like to suppress the Jetty WARN log about failing to bind 
> to a port already in use, so that my logs are less noisy.
> Currently, Jetty code prints the stack trace of the BindException at WARN 
> level. In the context of starting a service on an ephemeral port, this is not 
> a useful warning, and it is exceedingly verbose.

[jira] [Created] (SPARK-20181) Avoid noisy Jetty WARN log when failing to bind a port

2017-03-31 Thread Derek Dagit (JIRA)
Derek Dagit created SPARK-20181:
---

 Summary: Avoid noisy Jetty WARN log when failing to bind a port
 Key: SPARK-20181
 URL: https://issues.apache.org/jira/browse/SPARK-20181
 Project: Spark
  Issue Type: Improvement
  Components: Spark Core
Affects Versions: 2.1.0
Reporter: Derek Dagit
Priority: Minor


As a user, I would like to suppress the Jetty WARN log about failing to bind to 
a port already in use, so that my logs are less noisy.

Currently, Jetty code prints the stack trace of the BindException at WARN 
level. In the context of starting a service on an ephemeral port, this is not a 
useful warning, and it is exceedingly verbose.

{noformat}
17/03/06 14:57:26 WARN AbstractLifeCycle: FAILED ServerConnector@79476a4e{HTTP/1.1}{0.0.0.0:4040}: java.net.BindException: Address already in use
java.net.BindException: Address already in use
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:433)
    at sun.nio.ch.Net.bind(Net.java:425)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at org.spark_project.jetty.server.ServerConnector.open(ServerConnector.java:321)
    at org.spark_project.jetty.server.AbstractNetworkConnector.doStart(AbstractNetworkConnector.java:80)
    at org.spark_project.jetty.server.ServerConnector.doStart(ServerConnector.java:236)
    at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
    at org.spark_project.jetty.server.Server.doStart(Server.java:366)
    at org.spark_project.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:68)
    at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:306)
    at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:316)
    at org.apache.spark.ui.JettyUtils$$anonfun$5.apply(JettyUtils.scala:316)
    at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:2175)
    at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:160)
    at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:2166)
    at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:316)
    at org.apache.spark.ui.WebUI.bind(WebUI.scala:139)
    at org.apache.spark.SparkContext$$anonfun$10.apply(SparkContext.scala:448)
    at org.apache.spark.SparkContext$$anonfun$10.apply(SparkContext.scala:448)
    at scala.Option.foreach(Option.scala:257)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:448)
    at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2282)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
    at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
    at scala.Option.getOrElse(Option.scala:121)
    at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
    at org.apache.spark.repl.Main$.createSparkSession(Main.scala:95)
    at $line3.$read$$iw$$iw.<init>(<console>:15)
    at $line3.$read$$iw.<init>(<console>:31)
    at $line3.$read.<init>(<console>:33)
    at $line3.$read$.<init>(<console>:37)
    at $line3.$read$.<clinit>(<console>)
    at $line3.$eval$.$print$lzycompute(<console>:7)
    at $line3.$eval$.$print(<console>:6)
    at $line3.$eval.$print(<console>)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:786)
    at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1047)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:638)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:637)
    at scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)
    at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)
    at scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:637)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:569)
    at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
    at scala.tools.nsc.interpreter.ILoop.interpretStartingWith(ILoop.scala:807)
    at scala.tools.nsc.interpreter.ILoop.command(ILoop.scala:681)
    at scala.tools.nsc.interpreter.ILoop.processLine(ILoop.scala:395)
{noformat}

[jira] [Commented] (SPARK-11185) Add more task metrics to the "all Stages Page"

2015-12-28 Thread Derek Dagit (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-11185?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15072870#comment-15072870 ]

Derek Dagit commented on SPARK-11185:
-

[~pwendell], [~kayousterhout], [~andrewor14], [~irashid]

Let's figure out how we might address this use case. Do we think there is a way 
to present this information without cluttering the UI?



> Add more task metrics to the "all Stages Page"
> --
>
> Key: SPARK-11185
> URL: https://issues.apache.org/jira/browse/SPARK-11185
> Project: Spark
>  Issue Type: Improvement
>  Components: Web UI
>Affects Versions: 1.5.1
>Reporter: Thomas Graves
>
> The "All Stages Page" on the History page could have more information about 
> the stage to allow users to quickly see which stage potentially has long 
> tasks. Indicator or skewed data or bad nodes, etc.  
> Currently to get this information you have to click on every stage.  If you 
> have a hundreds of stages this can be very cumbersome.
> For instance pulling out the max task time and the median to the all stages 
> page would allow me to see the difference and if the max task time is much 
> greater then the median this stage may have had tasks with problems.  
> We already had some discussion about this under 
> https://github.com/apache/spark/pull/9051
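
To make the max-vs-median heuristic above concrete, here is a small self-contained Scala sketch; the object name, the sample durations, and the 2x threshold are illustrative only, not Spark's actual UI code:

{noformat}
// Illustrative skew check: flag a stage when its slowest task runs much
// longer than its median task. Durations are task run times in milliseconds.
object TaskSkew {
  def median(durations: Seq[Long]): Long = {
    val sorted = durations.sorted
    val n = sorted.length
    if (n % 2 == 1) sorted(n / 2) else (sorted(n / 2 - 1) + sorted(n / 2)) / 2
  }

  // Hypothetical threshold: a max more than 2x the median suggests stragglers.
  def looksSkewed(durations: Seq[Long], factor: Double = 2.0): Boolean =
    durations.nonEmpty && durations.max > factor * median(durations)

  def main(args: Array[String]): Unit = {
    val taskDurations = Seq(120L, 130L, 125L, 118L, 900L) // one straggler
    println(s"median = ${median(taskDurations)} ms, max = ${taskDurations.max} ms")
    println(s"skewed? ${looksSkewed(taskDurations)}")
  }
}
{noformat}

Rendering the two numbers (or their ratio) as one extra column per stage row would surface stragglers without adding much visual clutter.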






[jira] [Commented] (SPARK-10930) History "Stages" page "duration" can be confusing

2015-10-08 Thread Derek Dagit (JIRA)

[ https://issues.apache.org/jira/browse/SPARK-10930?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14949313#comment-14949313 ]

Derek Dagit commented on SPARK-10930:
-

We should create a new column with the maximum duration of all tasks in the 
stage. This will let us identify the stages whose tasks took the longest time.
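
A minimal sketch of how such a column could be derived, assuming (stageId, task duration) pairs are already at hand; all names and values below are hypothetical:

{noformat}
// Hypothetical input: (stageId, task duration in ms) pairs from completed tasks.
val taskDurations: Seq[(Int, Long)] = Seq(
  (1, 200L), (1, 250L), (1, 4000L), // stage 1 contains one very slow task
  (2, 150L), (2, 160L)
)

// The proposed "Max Task Duration" column: the longest task seen in each stage.
val maxTaskDurationByStage: Map[Int, Long] =
  taskDurations
    .groupBy { case (stageId, _) => stageId }
    .map { case (stageId, tasks) => stageId -> tasks.map(_._2).max }

maxTaskDurationByStage.toSeq.sortBy(_._1).foreach { case (stageId, maxMs) =>
  println(s"stage $stageId: max task duration = $maxMs ms")
}
{noformat}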

> History "Stages" page "duration" can be confusing
> -
>
> Key: SPARK-10930
> URL: https://issues.apache.org/jira/browse/SPARK-10930
> Project: Spark
>  Issue Type: Improvement
>  Components: Web UI
>Affects Versions: 1.5.1
>Reporter: Thomas Graves
>
> The Spark history server "stages" page shows each stage's submitted time and 
> duration. The duration can be confusing, since a stage might not actually 
> start tasks until much later than its submitted time if it is waiting on 
> previous stages. This makes it hard to figure out which stages were really 
> slow without clicking into each stage.
> It would be nice to have a first-task-launched time, or the processing time 
> spent in each stage, to be able to easily find the slow stages.
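
To illustrate the difference between the two numbers, a tiny Scala sketch; the case class and the timestamps are made up for the example:

{noformat}
// Illustrative stage timeline (epoch millis); names and values are hypothetical.
case class StageTimes(submitted: Long, firstTaskLaunched: Long, completed: Long)

// A stage that sat queued for 30s behind its parents, then ran tasks for 10s.
val stage = StageTimes(submitted = 0L, firstTaskLaunched = 30000L, completed = 40000L)

// What the Stages page reports today: wall time since submission,
// which includes the time spent waiting on previous stages.
val reportedDurationMs = stage.completed - stage.submitted // 40000

// The proposed, less confusing number: time actually spent running tasks.
val processingTimeMs = stage.completed - stage.firstTaskLaunched // 10000

println(s"reported = $reportedDurationMs ms, processing = $processingTimeMs ms")
{noformat}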


