[ https://issues.apache.org/jira/browse/SPARK-22365?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16224452#comment-16224452 ]

Jakub Dubovsky commented on SPARK-22365:
----------------------------------------

[~srowen] I will look into the log for "caused by" entries. Also, this happens 
every time in my setup, so I am able to reproduce it; if it were a one-time 
error I wouldn't have bothered creating a ticket.
[~guoxiaolongzte] I am willing to do that, but I have no idea what you mean 
by snapshot.

I will be slow on debugging this as it is not my top priority right now, but I 
want to get it resolved since it prevents me from checking logs on executors 
while the app is running.
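For what it's worth, the Executors tab is populated from Spark's monitoring REST API (served under {{/api/v1}} on the application UI), so querying that endpoint directly may show whether the 500 comes from the REST layer itself. A minimal sketch; the host, port, and application id below are placeholders, not values from this report:

```python
# Sketch: query the REST endpoint that backs the Executors tab.
# Base URL and app id are hypothetical placeholders - substitute the
# values from your own running application UI.
import json
from urllib.request import urlopen

def executors_url(base_url, app_id):
    """Build the monitoring-API URL the Executors tab reads from."""
    return f"{base_url}/api/v1/applications/{app_id}/executors"

url = executors_url("http://localhost:4040", "app-20171027120000-0000")
# Uncomment against a live application UI; a 500 here would point at
# the REST resource rather than the page rendering:
# executors = json.load(urlopen(url))
```

If the endpoint returns JSON cleanly while the tab still fails, the NPE is more likely in the UI path shown in the trace below.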

> Spark UI executors empty list with 500 error
> --------------------------------------------
>
>                 Key: SPARK-22365
>                 URL: https://issues.apache.org/jira/browse/SPARK-22365
>             Project: Spark
>          Issue Type: Bug
>          Components: Web UI
>    Affects Versions: 2.2.0
>            Reporter: Jakub Dubovsky
>
> No data is loaded on the "Executors" tab in the Spark UI; the stack trace 
> is below. Apart from the exception I have nothing more, but if I can test 
> something to make this easier to resolve, I am happy to help.
> {{java.lang.NullPointerException
>       at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:388)
>       at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:341)
>       at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:228)
>       at org.spark_project.jetty.servlet.ServletHolder.handle(ServletHolder.java:845)
>       at org.spark_project.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1689)
>       at org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.doFilter(AmIpFilter.java:164)
>       at org.spark_project.jetty.servlet.ServletHandler$CachedChain.doFilter(ServletHandler.java:1676)
>       at org.spark_project.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:581)
>       at org.spark_project.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1180)
>       at org.spark_project.jetty.servlet.ServletHandler.doScope(ServletHandler.java:511)
>       at org.spark_project.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1112)
>       at org.spark_project.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
>       at org.spark_project.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:461)
>       at org.spark_project.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:213)
>       at org.spark_project.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
>       at org.spark_project.jetty.server.Server.handle(Server.java:524)
>       at org.spark_project.jetty.server.HttpChannel.handle(HttpChannel.java:319)
>       at org.spark_project.jetty.server.HttpConnection.onFillable(HttpConnection.java:253)
>       at org.spark_project.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:273)
>       at org.spark_project.jetty.io.FillInterest.fillable(FillInterest.java:95)
>       at org.spark_project.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
>       at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
>       at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
>       at org.spark_project.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
>       at org.spark_project.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
>       at org.spark_project.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
>       at java.lang.Thread.run(Thread.java:748)}}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org