[ https://issues.apache.org/jira/browse/SPARK-50757?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17954226#comment-17954226 ]
Yu Tang commented on SPARK-50757:
---------------------------------

I got the same log from SHS, but without the exception stack, and the front page also has a rendering problem: the request for spark-sql-viz.js gets an HTTP 500 error, and the request for spark-sql-viz.css gets an HTTP 302 several times until the browser kills it. The path to the static resource seems right to me, as it [mounts|https://github.com/apache/spark/blob/bba8cf48d14f91109eea04e22fd19be188fce5fb/sql/core/src/main/scala/org/apache/spark/sql/execution/ui/SQLTab.scala#L37] to "/static/sql".
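
For anyone trying to reproduce this, the two requests can be replayed directly against the History Server with something like the sketch below. The host and port are assumptions (18080 is only the default spark.history.ui.port), and the paths are simply the ones from the failing page load described above:

{noformat}
# Assumed host/port; adjust for your deployment.
# The .js request should reproduce the HTTP 500; the .css request shows the
# initial 302 and its Location header (the browser, not curl, is what ends up
# following the redirect loop).
curl -sS -o /dev/null -w '%{http_code}\n' http://localhost:18080/static/sql/spark-sql-viz.js
curl -sS -D - -o /dev/null http://localhost:18080/static/sql/spark-sql-viz.css
{noformat}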

> SHS UI logs Jetty NPE when trying to load /static/sql/spark-sql-viz.js
> ----------------------------------------------------------------------
>
>                 Key: SPARK-50757
>                 URL: https://issues.apache.org/jira/browse/SPARK-50757
>             Project: Spark
>          Issue Type: Bug
>          Components: UI
>    Affects Versions: 3.4.3
>            Reporter: Rhys Jones
>            Priority: Minor
>
> I have been reviewing logs from our Spark History Server, and have noticed a
> fairly infrequent occurrence of a NullPointerException getting logged at
> WARN level from Jetty server threads. The log message itself is simply:
> {noformat}
> /static/sql/spark-sql-viz.js{noformat}
> coming from the logger:
> {noformat}
> org.sparkproject.jetty.server.HttpChannel{noformat}
> with the full exception:
> {code:java}
> java.lang.NullPointerException: Cannot invoke "javax.servlet.Filter.doFilter(javax.servlet.ServletRequest, javax.servlet.ServletResponse, javax.servlet.FilterChain)" because the return value of "org.sparkproject.jetty.servlet.FilterHolder.getFilter()" is null
>     at org.sparkproject.jetty.servlet.FilterHolder.doFilter(FilterHolder.java:193)
>     at org.sparkproject.jetty.servlet.ServletHandler$Chain.doFilter(ServletHandler.java:1626)
>     at org.sparkproject.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:552)
>     at org.sparkproject.jetty.server.handler.ScopedHandler.nextHandle(ScopedHandler.java:233)
>     at org.sparkproject.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1440)
>     at org.sparkproject.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:188)
>     at org.sparkproject.jetty.servlet.ServletHandler.doScope(ServletHandler.java:505)
>     at org.sparkproject.jetty.server.handler.ScopedHandler.nextScope(ScopedHandler.java:186)
>     at org.sparkproject.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1355)
>     at org.sparkproject.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
>     at org.sparkproject.jetty.server.handler.gzip.GzipHandler.handle(GzipHandler.java:772)
>     at org.sparkproject.jetty.server.handler.ContextHandlerCollection.handle(ContextHandlerCollection.java:234)
>     at org.sparkproject.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:127)
>     at org.sparkproject.jetty.server.Server.handle(Server.java:516)
>     at org.sparkproject.jetty.server.HttpChannel.lambda$handle$1(HttpChannel.java:487)
>     at org.sparkproject.jetty.server.HttpChannel.dispatch(HttpChannel.java:732)
>     at org.sparkproject.jetty.server.HttpChannel.handle(HttpChannel.java:479)
>     at org.sparkproject.jetty.server.HttpConnection.onFillable(HttpConnection.java:277)
>     at org.sparkproject.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:311)
>     at org.sparkproject.jetty.io.FillInterest.fillable(FillInterest.java:105)
>     at org.sparkproject.jetty.io.ChannelEndPoint$1.run(ChannelEndPoint.java:104)
>     at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.runTask(EatWhatYouKill.java:338)
>     at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.doProduce(EatWhatYouKill.java:315)
>     at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.tryProduce(EatWhatYouKill.java:173)
>     at org.sparkproject.jetty.util.thread.strategy.EatWhatYouKill.run(EatWhatYouKill.java:131)
>     at org.sparkproject.jetty.util.thread.ReservedThreadExecutor$ReservedThread.run(ReservedThreadExecutor.java:409)
>     at org.sparkproject.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:883)
>     at org.sparkproject.jetty.util.thread.QueuedThreadPool$Runner.run(QueuedThreadPool.java:1034)
>     at java.base/java.lang.Thread.run(Unknown Source) {code}
> I believe the problem traces to the spark-sql code base, where the path to
> that static js file is incorrect in
> [this|https://github.com/apache/spark/blame/bba8cf48d14f91109eea04e22fd19be188fce5fb/sql/core/src/main/scala/org/apache/spark/sql/execution/ui/ExecutionPage.scala#L105]
> line of code. I can confirm from my runtime environment that the js script is
> in "/static/spark-sql-viz.js" and not in "/static/sql/spark-sql-viz.js":
> {noformat}
> jar tvf spark/jars/spark-sql_2.13-3.5.3.jar | grep static
>      0 Mon Sep 09 05:02:40 CDT 2024 org/apache/spark/sql/execution/ui/static/
>   1573 Mon Sep 09 05:02:40 CDT 2024 org/apache/spark/sql/execution/ui/static/spark-sql-viz.css
>   7382 Mon Sep 09 05:02:40 CDT 2024 org/apache/spark/sql/execution/ui/static/spark-sql-viz.js{noformat}
> I'm unsure what problems this actually causes, as the warning gets logged
> pretty infrequently and I haven't had any users yet indicate they were facing
> problems rendering their SQL tabs from the SHS UI. Our current version of SHS
> is on Spark 3.5.3, Scala 2.13, JDK 17, bundled to run in a 3-node cluster
> containerized in Docker.