This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 3331d4ccb7d [SPARK-39620][WEB UI] Use same condition in history server 
page and API to filter applications
3331d4ccb7d is described below

commit 3331d4ccb7df9aeb1972ed86472269a9dbd261ff
Author: kuwii <kuwii.some...@gmail.com>
AuthorDate: Fri Jul 8 08:20:27 2022 -0500

    [SPARK-39620][WEB UI] Use same condition in history server page and API to 
filter applications
    
    ### What changes were proposed in this pull request?
    
    Updated the REST API `/api/v1/applications` to use the same condition as the
    history server page to filter completed/incomplete applications.
    
    ### Why are the changes needed?
    
    When opening the summary page, the history server follows this logic:

    - If there is a completed/incomplete application, the page adds a script to
    the response, which calls the REST API via AJAX to get the filtered list.
    - If there is no such application, the page only returns a message saying
    nothing was found.
    
    The issue is that the page and the REST API use different conditions to
    filter applications. In `HistoryPage`, an application is considered
    completed as long as its last attempt is completed. But in
    `ApplicationListResource`, all attempts must be completed. This
    inconsistency causes an issue in a corner case.
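    The two conditions can be sketched as follows (a minimal model with
    hypothetical, simplified classes; the real logic lives in `HistoryPage` and
    `ApplicationListResource`):

```scala
// Minimal sketch of the two filter conditions before this change.
// Attempt/App are hypothetical stand-ins for the real API classes.
case class Attempt(completed: Boolean)
// attempts are assumed sorted newest-first, as in ApplicationInfo
case class App(attempts: List[Attempt])

// HistoryPage: completed as long as the newest attempt is completed.
def pageCompleted(app: App): Boolean =
  app.attempts.headOption.exists(_.completed)

// ApplicationListResource, before this change: the app is still running
// if *any* attempt is running, i.e. completed only if every attempt is.
def apiCompleted(app: App): Boolean =
  app.attempts.nonEmpty && app.attempts.forall(_.completed)

// The corner case: the retry finished, but attempt 1 lost its end event.
val app = App(List(Attempt(completed = true), Attempt(completed = false)))
// pageCompleted(app) is true while apiCompleted(app) is false, so the
// page emits the AJAX script but the API returns an empty list.
```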
    
    In the driver, event queues have a bounded capacity to protect memory. When
    there are too many events, some of them are dropped and the event log file
    ends up incomplete. For an application with multiple attempts, it is
    possible that the last attempt is completed, but a previous attempt is
    considered incomplete because its application-end event was lost.
    
    For this type of application, the page thinks it is completed, but the API
    thinks it is still running. When opening the summary page:
    - When checking completed applications, the page runs the script, but the
    API returns nothing.
    - When checking incomplete applications, the page returns nothing.
    
    So the user won't be able to see this app in the history server at all.
    
    ### Does this PR introduce _any_ user-facing change?
    
    Yes, there is a change to the `/api/v1/applications` API and the history
    server summary page.
    
    When calling the API, the application mentioned above was previously
    considered running; after the change it is considered completed, so the
    same filter returns a different result. This change should be OK, because
    attempts are executed sequentially and incrementally: if an attempt with a
    bigger ID has completed, the previous attempts can also be considered
    completed.
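    The updated API condition can be sketched like this (simplified model; the
    real `ApplicationInfo` keeps its attempts sorted newest-first):

```scala
// Hypothetical minimal stand-in for an attempt record.
case class Attempt(completed: Boolean)

// New condition in ApplicationListResource: the app counts as running iff it
// has no attempts yet, or its newest attempt has not completed -- the same
// view HistoryPage takes.
def anyRunning(attempts: List[Attempt]): Boolean =
  attempts.isEmpty || !attempts.head.completed

// Attempt 2 completed, attempt 1 lost its end event (newest first):
val attempts = List(Attempt(completed = true), Attempt(completed = false))
// anyRunning(attempts) is false, so the app is now filtered as completed.
```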
    
    On the history server summary page, the user previously could not see the
    application. Now it appears among the completed applications.
    
    ### How was this patch tested?
    
    Added a new unit test suite, `HistoryServerPageSuite`, which checks whether
    `HistoryPage` behaves the same as `ApplicationListResource` when filtering
    applications. To support the test, there is a minor change to
    `HistoryPage`: it now exposes a method `shouldDisplayApplications` that
    tells whether the summary page will display applications.
    
    The test verifies that:
    - If no completed/incomplete application is found, `HistoryPage` should not
    display applications, and the API should return an empty list.
    - Otherwise, `HistoryPage` should display applications, and the API should
    return a non-empty list.
    
    Two scenarios are currently included:
    - Application with last attempt completed but previous attempt incomplete.
    - Application with last attempt incomplete but previous attempt completed.
    
    Closes #37008 from kuwii/kuwii/hs-fix.
    
    Authored-by: kuwii <kuwii.some...@gmail.com>
    Signed-off-by: Sean Owen <sro...@gmail.com>
---
 .../apache/spark/deploy/history/HistoryPage.scala  |   7 +-
 .../status/api/v1/ApplicationListResource.scala    |   2 +-
 .../application_1656321732247_0006_1               |  10 ++
 .../application_1656321732247_0006_2               |   9 ++
 .../application_1656321732247_0006_1               |   9 ++
 .../application_1656321732247_0006_2               |  10 ++
 .../deploy/history/HistoryServerPageSuite.scala    | 103 +++++++++++++++++++++
 dev/.rat-excludes                                  |   1 +
 8 files changed, 148 insertions(+), 3 deletions(-)

diff --git 
a/core/src/main/scala/org/apache/spark/deploy/history/HistoryPage.scala 
b/core/src/main/scala/org/apache/spark/deploy/history/HistoryPage.scala
index 26bc11a4878..f2cd5b7e240 100644
--- a/core/src/main/scala/org/apache/spark/deploy/history/HistoryPage.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/history/HistoryPage.scala
@@ -30,8 +30,7 @@ private[history] class HistoryPage(parent: HistoryServer) 
extends WebUIPage("")
     val requestedIncomplete = Option(request.getParameter("showIncomplete"))
       .getOrElse("false").toBoolean
 
-    val displayApplications = parent.getApplicationList()
-      .exists(isApplicationCompleted(_) != requestedIncomplete)
+    val displayApplications = shouldDisplayApplications(requestedIncomplete)
     val eventLogsUnderProcessCount = parent.getEventLogsUnderProcess()
     val lastUpdatedTime = parent.getLastUpdatedTime()
     val providerConfig = parent.getProviderConfig()
@@ -91,6 +90,10 @@ private[history] class HistoryPage(parent: HistoryServer) 
extends WebUIPage("")
     UIUtils.basicSparkPage(request, content, "History Server", true)
   }
 
+  def shouldDisplayApplications(requestedIncomplete: Boolean): Boolean = {
+    parent.getApplicationList().exists(isApplicationCompleted(_) != 
requestedIncomplete)
+  }
+
   private def makePageLink(request: HttpServletRequest, showIncomplete: 
Boolean): String = {
     UIUtils.prependBaseUri(request, "/?" + "showIncomplete=" + showIncomplete)
   }
diff --git 
a/core/src/main/scala/org/apache/spark/status/api/v1/ApplicationListResource.scala
 
b/core/src/main/scala/org/apache/spark/status/api/v1/ApplicationListResource.scala
index 197cf64ebdc..6eb8b2bfd55 100644
--- 
a/core/src/main/scala/org/apache/spark/status/api/v1/ApplicationListResource.scala
+++ 
b/core/src/main/scala/org/apache/spark/status/api/v1/ApplicationListResource.scala
@@ -38,7 +38,7 @@ private[v1] class ApplicationListResource extends 
ApiRequestContext {
     val includeRunning = status.isEmpty || 
status.contains(ApplicationStatus.RUNNING)
 
     uiRoot.getApplicationInfoList.filter { app =>
-      val anyRunning = app.attempts.exists(!_.completed)
+      val anyRunning = app.attempts.isEmpty || !app.attempts.head.completed
       // if any attempt is still running, we consider the app to also still be 
running;
       // keep the app if *any* attempts fall in the right time window
       ((!anyRunning && includeCompleted) || (anyRunning && includeRunning)) &&
diff --git 
a/core/src/test/resources/spark-events-broken/last-attempt-incomplete/application_1656321732247_0006_1
 
b/core/src/test/resources/spark-events-broken/last-attempt-incomplete/application_1656321732247_0006_1
new file mode 100644
index 00000000000..835fa844fca
--- /dev/null
+++ 
b/core/src/test/resources/spark-events-broken/last-attempt-incomplete/application_1656321732247_0006_1
@@ -0,0 +1,10 @@
+{"Event":"SparkListenerLogStart","Spark Version":"3.4.0-SNAPSHOT"}
+{"Event":"SparkListenerResourceProfileAdded","Resource Profile Id":0,"Executor 
Resource Requests":{"cores":{"Resource Name":"cores","Amount":1,"Discovery 
Script":"","Vendor":""},"memory":{"Resource 
Name":"memory","Amount":1024,"Discovery 
Script":"","Vendor":""},"offHeap":{"Resource 
Name":"offHeap","Amount":0,"Discovery Script":"","Vendor":""}},"Task Resource 
Requests":{"cpus":{"Resource Name":"cpus","Amount":1.0}}}
+{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor 
ID":"driver","Host":"192.168.122.132","Port":40661},"Maximum 
Memory":384093388,"Timestamp":1656322531973,"Maximum Onheap 
Memory":384093388,"Maximum Offheap Memory":0}
+{"Event":"SparkListenerEnvironmentUpdate","JVM Information":{"Java 
Home":"/usr/lib/jvm/java-8-openjdk-amd64/jre","Java Version":"1.8.0_312 
(Private Build)","Scala Version":"version 2.12.16"},"Spark 
Properties":{"spark.executor.extraJavaOptions":"-Djava.net.preferIPv6Addresses=false
 -XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED 
--add-opens=java.base/java.lang.invoke=ALL-UNNAMED 
--add-opens=java.base/java.lang.reflect=ALL-UNNAMED 
--add-opens=java.base/java.io [...]
+{"Event":"SparkListenerApplicationStart","App Name":"PythonPi","App 
ID":"application_1656321732247_0006","Timestamp":1656322530893,"User":"kuwii","App
 Attempt ID":"1","Driver 
Logs":{"stdout":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_01_000001/kuwii/stdout?start=-4096","stderr":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_01_000001/kuwii/stderr?start=-4096"},"Driver
 Attributes":{"NM_HTTP_ADDRESS":"localhost:8042","USER":"kuwii","LOG_F [...]
+{"Event":"SparkListenerExecutorAdded","Timestamp":1656322537728,"Executor 
ID":"1","Executor Info":{"Host":"localhost","Total Cores":1,"Log 
Urls":{"stdout":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_01_000002/kuwii/stdout?start=-4096","stderr":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_01_000002/kuwii/stderr?start=-4096"},"Attributes":{"NM_HTTP_ADDRESS":"localhost:8042","USER":"kuwii","LOG_FILES":"stderr,stdout","NM_HTTP_PORT":"8042";
 [...]
+{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor 
ID":"1","Host":"localhost","Port":41691},"Maximum 
Memory":384093388,"Timestamp":1656322537822,"Maximum Onheap 
Memory":384093388,"Maximum Offheap Memory":0}
+{"Event":"SparkListenerExecutorAdded","Timestamp":1656322538721,"Executor 
ID":"2","Executor Info":{"Host":"localhost","Total Cores":1,"Log 
Urls":{"stdout":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_01_000003/kuwii/stdout?start=-4096","stderr":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_01_000003/kuwii/stderr?start=-4096"},"Attributes":{"NM_HTTP_ADDRESS":"localhost:8042","USER":"kuwii","LOG_FILES":"stderr,stdout","NM_HTTP_PORT":"8042";
 [...]
+{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor 
ID":"2","Host":"localhost","Port":46145},"Maximum 
Memory":384093388,"Timestamp":1656322538831,"Maximum Onheap 
Memory":384093388,"Maximum Offheap Memory":0}
+{"Event":"SparkListenerApplicationEnd","Timestamp":1656322538998}
diff --git 
a/core/src/test/resources/spark-events-broken/last-attempt-incomplete/application_1656321732247_0006_2
 
b/core/src/test/resources/spark-events-broken/last-attempt-incomplete/application_1656321732247_0006_2
new file mode 100644
index 00000000000..1e7312f4ec3
--- /dev/null
+++ 
b/core/src/test/resources/spark-events-broken/last-attempt-incomplete/application_1656321732247_0006_2
@@ -0,0 +1,9 @@
+{"Event":"SparkListenerLogStart","Spark Version":"3.4.0-SNAPSHOT"}
+{"Event":"SparkListenerResourceProfileAdded","Resource Profile Id":0,"Executor 
Resource Requests":{"cores":{"Resource Name":"cores","Amount":1,"Discovery 
Script":"","Vendor":""},"memory":{"Resource 
Name":"memory","Amount":1024,"Discovery 
Script":"","Vendor":""},"offHeap":{"Resource 
Name":"offHeap","Amount":0,"Discovery Script":"","Vendor":""}},"Task Resource 
Requests":{"cpus":{"Resource Name":"cpus","Amount":1.0}}}
+{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor 
ID":"driver","Host":"192.168.122.132","Port":43289},"Maximum 
Memory":384093388,"Timestamp":1656322544350,"Maximum Onheap 
Memory":384093388,"Maximum Offheap Memory":0}
+{"Event":"SparkListenerEnvironmentUpdate","JVM Information":{"Java 
Home":"/usr/lib/jvm/java-8-openjdk-amd64/jre","Java Version":"1.8.0_312 
(Private Build)","Scala Version":"version 2.12.16"},"Spark 
Properties":{"spark.executor.extraJavaOptions":"-Djava.net.preferIPv6Addresses=false
 -XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED 
--add-opens=java.base/java.lang.invoke=ALL-UNNAMED 
--add-opens=java.base/java.lang.reflect=ALL-UNNAMED 
--add-opens=java.base/java.io [...]
+{"Event":"SparkListenerApplicationStart","App Name":"PythonPi","App 
ID":"application_1656321732247_0006","Timestamp":1656322543203,"User":"kuwii","App
 Attempt ID":"2","Driver 
Logs":{"stdout":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_02_000001/kuwii/stdout?start=-4096","stderr":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_02_000001/kuwii/stderr?start=-4096"},"Driver
 Attributes":{"NM_HTTP_ADDRESS":"localhost:8042","USER":"kuwii","LOG_F [...]
+{"Event":"SparkListenerExecutorAdded","Timestamp":1656322550105,"Executor 
ID":"1","Executor Info":{"Host":"localhost","Total Cores":1,"Log 
Urls":{"stdout":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_02_000002/kuwii/stdout?start=-4096","stderr":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_02_000002/kuwii/stderr?start=-4096"},"Attributes":{"NM_HTTP_ADDRESS":"localhost:8042","USER":"kuwii","LOG_FILES":"stderr,stdout","NM_HTTP_PORT":"8042";
 [...]
+{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor 
ID":"1","Host":"localhost","Port":41059},"Maximum 
Memory":384093388,"Timestamp":1656322550218,"Maximum Onheap 
Memory":384093388,"Maximum Offheap Memory":0}
+{"Event":"SparkListenerExecutorAdded","Timestamp":1656322552392,"Executor 
ID":"2","Executor Info":{"Host":"localhost","Total Cores":1,"Log 
Urls":{"stdout":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_02_000003/kuwii/stdout?start=-4096","stderr":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_02_000003/kuwii/stderr?start=-4096"},"Attributes":{"NM_HTTP_ADDRESS":"localhost:8042","USER":"kuwii","LOG_FILES":"stderr,stdout","NM_HTTP_PORT":"8042";
 [...]
+{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor 
ID":"2","Host":"localhost","Port":42333},"Maximum 
Memory":384093388,"Timestamp":1656322552495,"Maximum Onheap 
Memory":384093388,"Maximum Offheap Memory":0}
diff --git 
a/core/src/test/resources/spark-events-broken/previous-attempt-incomplete/application_1656321732247_0006_1
 
b/core/src/test/resources/spark-events-broken/previous-attempt-incomplete/application_1656321732247_0006_1
new file mode 100644
index 00000000000..aebf0aa77a7
--- /dev/null
+++ 
b/core/src/test/resources/spark-events-broken/previous-attempt-incomplete/application_1656321732247_0006_1
@@ -0,0 +1,9 @@
+{"Event":"SparkListenerLogStart","Spark Version":"3.4.0-SNAPSHOT"}
+{"Event":"SparkListenerResourceProfileAdded","Resource Profile Id":0,"Executor 
Resource Requests":{"cores":{"Resource Name":"cores","Amount":1,"Discovery 
Script":"","Vendor":""},"memory":{"Resource 
Name":"memory","Amount":1024,"Discovery 
Script":"","Vendor":""},"offHeap":{"Resource 
Name":"offHeap","Amount":0,"Discovery Script":"","Vendor":""}},"Task Resource 
Requests":{"cpus":{"Resource Name":"cpus","Amount":1.0}}}
+{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor 
ID":"driver","Host":"192.168.122.132","Port":40661},"Maximum 
Memory":384093388,"Timestamp":1656322531973,"Maximum Onheap 
Memory":384093388,"Maximum Offheap Memory":0}
+{"Event":"SparkListenerEnvironmentUpdate","JVM Information":{"Java 
Home":"/usr/lib/jvm/java-8-openjdk-amd64/jre","Java Version":"1.8.0_312 
(Private Build)","Scala Version":"version 2.12.16"},"Spark 
Properties":{"spark.executor.extraJavaOptions":"-Djava.net.preferIPv6Addresses=false
 -XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED 
--add-opens=java.base/java.lang.invoke=ALL-UNNAMED 
--add-opens=java.base/java.lang.reflect=ALL-UNNAMED 
--add-opens=java.base/java.io [...]
+{"Event":"SparkListenerApplicationStart","App Name":"PythonPi","App 
ID":"application_1656321732247_0006","Timestamp":1656322530893,"User":"kuwii","App
 Attempt ID":"1","Driver 
Logs":{"stdout":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_01_000001/kuwii/stdout?start=-4096","stderr":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_01_000001/kuwii/stderr?start=-4096"},"Driver
 Attributes":{"NM_HTTP_ADDRESS":"localhost:8042","USER":"kuwii","LOG_F [...]
+{"Event":"SparkListenerExecutorAdded","Timestamp":1656322537728,"Executor 
ID":"1","Executor Info":{"Host":"localhost","Total Cores":1,"Log 
Urls":{"stdout":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_01_000002/kuwii/stdout?start=-4096","stderr":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_01_000002/kuwii/stderr?start=-4096"},"Attributes":{"NM_HTTP_ADDRESS":"localhost:8042","USER":"kuwii","LOG_FILES":"stderr,stdout","NM_HTTP_PORT":"8042";
 [...]
+{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor 
ID":"1","Host":"localhost","Port":41691},"Maximum 
Memory":384093388,"Timestamp":1656322537822,"Maximum Onheap 
Memory":384093388,"Maximum Offheap Memory":0}
+{"Event":"SparkListenerExecutorAdded","Timestamp":1656322538721,"Executor 
ID":"2","Executor Info":{"Host":"localhost","Total Cores":1,"Log 
Urls":{"stdout":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_01_000003/kuwii/stdout?start=-4096","stderr":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_01_000003/kuwii/stderr?start=-4096"},"Attributes":{"NM_HTTP_ADDRESS":"localhost:8042","USER":"kuwii","LOG_FILES":"stderr,stdout","NM_HTTP_PORT":"8042";
 [...]
+{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor 
ID":"2","Host":"localhost","Port":46145},"Maximum 
Memory":384093388,"Timestamp":1656322538831,"Maximum Onheap 
Memory":384093388,"Maximum Offheap Memory":0}
diff --git 
a/core/src/test/resources/spark-events-broken/previous-attempt-incomplete/application_1656321732247_0006_2
 
b/core/src/test/resources/spark-events-broken/previous-attempt-incomplete/application_1656321732247_0006_2
new file mode 100644
index 00000000000..f5fd379c972
--- /dev/null
+++ 
b/core/src/test/resources/spark-events-broken/previous-attempt-incomplete/application_1656321732247_0006_2
@@ -0,0 +1,10 @@
+{"Event":"SparkListenerLogStart","Spark Version":"3.4.0-SNAPSHOT"}
+{"Event":"SparkListenerResourceProfileAdded","Resource Profile Id":0,"Executor 
Resource Requests":{"cores":{"Resource Name":"cores","Amount":1,"Discovery 
Script":"","Vendor":""},"memory":{"Resource 
Name":"memory","Amount":1024,"Discovery 
Script":"","Vendor":""},"offHeap":{"Resource 
Name":"offHeap","Amount":0,"Discovery Script":"","Vendor":""}},"Task Resource 
Requests":{"cpus":{"Resource Name":"cpus","Amount":1.0}}}
+{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor 
ID":"driver","Host":"192.168.122.132","Port":43289},"Maximum 
Memory":384093388,"Timestamp":1656322544350,"Maximum Onheap 
Memory":384093388,"Maximum Offheap Memory":0}
+{"Event":"SparkListenerEnvironmentUpdate","JVM Information":{"Java 
Home":"/usr/lib/jvm/java-8-openjdk-amd64/jre","Java Version":"1.8.0_312 
(Private Build)","Scala Version":"version 2.12.16"},"Spark 
Properties":{"spark.executor.extraJavaOptions":"-Djava.net.preferIPv6Addresses=false
 -XX:+IgnoreUnrecognizedVMOptions --add-opens=java.base/java.lang=ALL-UNNAMED 
--add-opens=java.base/java.lang.invoke=ALL-UNNAMED 
--add-opens=java.base/java.lang.reflect=ALL-UNNAMED 
--add-opens=java.base/java.io [...]
+{"Event":"SparkListenerApplicationStart","App Name":"PythonPi","App 
ID":"application_1656321732247_0006","Timestamp":1656322543203,"User":"kuwii","App
 Attempt ID":"2","Driver 
Logs":{"stdout":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_02_000001/kuwii/stdout?start=-4096","stderr":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_02_000001/kuwii/stderr?start=-4096"},"Driver
 Attributes":{"NM_HTTP_ADDRESS":"localhost:8042","USER":"kuwii","LOG_F [...]
+{"Event":"SparkListenerExecutorAdded","Timestamp":1656322550105,"Executor 
ID":"1","Executor Info":{"Host":"localhost","Total Cores":1,"Log 
Urls":{"stdout":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_02_000002/kuwii/stdout?start=-4096","stderr":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_02_000002/kuwii/stderr?start=-4096"},"Attributes":{"NM_HTTP_ADDRESS":"localhost:8042","USER":"kuwii","LOG_FILES":"stderr,stdout","NM_HTTP_PORT":"8042";
 [...]
+{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor 
ID":"1","Host":"localhost","Port":41059},"Maximum 
Memory":384093388,"Timestamp":1656322550218,"Maximum Onheap 
Memory":384093388,"Maximum Offheap Memory":0}
+{"Event":"SparkListenerExecutorAdded","Timestamp":1656322552392,"Executor 
ID":"2","Executor Info":{"Host":"localhost","Total Cores":1,"Log 
Urls":{"stdout":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_02_000003/kuwii/stdout?start=-4096","stderr":"http://localhost:8042/node/containerlogs/container_1656321732247_0006_02_000003/kuwii/stderr?start=-4096"},"Attributes":{"NM_HTTP_ADDRESS":"localhost:8042","USER":"kuwii","LOG_FILES":"stderr,stdout","NM_HTTP_PORT":"8042";
 [...]
+{"Event":"SparkListenerBlockManagerAdded","Block Manager ID":{"Executor 
ID":"2","Host":"localhost","Port":42333},"Maximum 
Memory":384093388,"Timestamp":1656322552495,"Maximum Onheap 
Memory":384093388,"Maximum Offheap Memory":0}
+{"Event":"SparkListenerApplicationEnd","Timestamp":1656322552688}
diff --git 
a/core/src/test/scala/org/apache/spark/deploy/history/HistoryServerPageSuite.scala
 
b/core/src/test/scala/org/apache/spark/deploy/history/HistoryServerPageSuite.scala
new file mode 100644
index 00000000000..f6ef4f7b4f6
--- /dev/null
+++ 
b/core/src/test/scala/org/apache/spark/deploy/history/HistoryServerPageSuite.scala
@@ -0,0 +1,103 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contributor license agreements.  See the NOTICE file distributed with
+ * this work for additional information regarding copyright ownership.
+ * The ASF licenses this file to You under the Apache License, Version 2.0
+ * (the "License"); you may not use this file except in compliance with
+ * the License.  You may obtain a copy of the License at
+ *
+ *    http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.spark.deploy.history
+
+import java.net.URL
+import javax.servlet.http.HttpServletResponse
+
+import org.json4s.DefaultFormats
+import org.json4s.JsonAST._
+import org.json4s.jackson.JsonMethods.parse
+import org.scalatest.BeforeAndAfter
+
+import org.apache.spark.{SparkConf, SparkFunSuite}
+import org.apache.spark.internal.config.History._
+import org.apache.spark.internal.config.Tests._
+import org.apache.spark.status.api.v1.ApplicationStatus
+import org.apache.spark.util.Utils
+
+class HistoryServerPageSuite extends SparkFunSuite with BeforeAndAfter {
+  private implicit val format: DefaultFormats.type = DefaultFormats
+
+  private val logDirs = Seq(
+    getTestResourcePath("spark-events-broken/previous-attempt-incomplete"),
+    getTestResourcePath("spark-events-broken/last-attempt-incomplete")
+  )
+
+  private var server: Option[HistoryServer] = None
+  private val localhost: String = Utils.localHostNameForURI()
+  private var port: Int = -1
+
+  private def startHistoryServer(logDir: String): Unit = {
+    assert(server.isEmpty)
+    val conf = new SparkConf()
+      .set(HISTORY_LOG_DIR, logDir)
+      .set(UPDATE_INTERVAL_S.key, "0")
+      .set(IS_TESTING, true)
+    val provider = new FsHistoryProvider(conf)
+    provider.checkForLogs()
+    val securityManager = HistoryServer.createSecurityManager(conf)
+    val _server = new HistoryServer(conf, provider, securityManager, 18080)
+    _server.bind()
+    provider.start()
+    server = Some(_server)
+    port = _server.boundPort
+  }
+
+  private def stopHistoryServer(): Unit = {
+    server.foreach(_.stop())
+    server = None
+  }
+
+  private def callApplicationsAPI(requestedIncomplete: Boolean): Seq[JObject] 
= {
+    val param = if (requestedIncomplete) {
+      ApplicationStatus.RUNNING.toString.toLowerCase()
+    } else {
+      ApplicationStatus.COMPLETED.toString.toLowerCase()
+    }
+    val (code, jsonOpt, errOpt) = HistoryServerSuite.getContentAndCode(
+      new URL(s"http://$localhost:$port/api/v1/applications?status=$param";)
+    )
+    assert(code == HttpServletResponse.SC_OK)
+    assert(jsonOpt.isDefined)
+    assert(errOpt.isEmpty)
+    val json = parse(jsonOpt.get).extract[List[JObject]]
+    json
+  }
+
+  override def afterEach(): Unit = {
+    super.afterEach()
+    stopHistoryServer()
+  }
+
+  test("SPARK-39620: should behave the same as REST API when filtering applications") {
+    logDirs.foreach { logDir =>
+      startHistoryServer(logDir)
+      val page = new HistoryPage(server.get)
+      Seq(true, false).foreach { requestedIncomplete =>
+        val apiResponse = callApplicationsAPI(requestedIncomplete)
+        if (page.shouldDisplayApplications(requestedIncomplete)) {
+          assert(apiResponse.nonEmpty)
+        } else {
+          assert(apiResponse.isEmpty)
+        }
+      }
+      stopHistoryServer()
+    }
+  }
+}
diff --git a/dev/.rat-excludes b/dev/.rat-excludes
index 2ce8abc0f87..e1cc000c064 100644
--- a/dev/.rat-excludes
+++ b/dev/.rat-excludes
@@ -138,3 +138,4 @@ over10k
 exported_table/*
 ansible-for-test-node/*
 node_modules
+spark-events-broken/*


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
