This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 39507b7f537 [SPARK-45263][CORE][TESTS] Make `EventLoggingListenerSuite` independent from `spark.eventLog.compress` conf
39507b7f537 is described below

commit 39507b7f537dc06af0ebf49afbd53c1e36c11776
Author: Dongjoon Hyun <dh...@apple.com>
AuthorDate: Thu Sep 21 15:31:35 2023 -0700

    [SPARK-45263][CORE][TESTS] Make `EventLoggingListenerSuite` independent from `spark.eventLog.compress` conf
    
    ### What changes were proposed in this pull request?
    
    This is a test-only PR that makes `EventLoggingListenerSuite` independent of the default value of the `spark.eventLog.compress` conf.
    
    ### Why are the changes needed?
    
    Currently, the `EventLoggingListenerSuite` test code assumes that the default value of `spark.eventLog.compress` is `false`. We should make this assumption explicit.
    
    
https://github.com/apache/spark/blob/892fdc532696e703b353c4758320d69162fffe8c/core/src/test/scala/org/apache/spark/scheduler/EventLoggingListenerSuite.scala#L178
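
    For illustration, a minimal sketch of the explicit configuration the patch applies (it mirrors the diff below; the internal config entries are usable directly only because the suite lives inside the `org.apache.spark` package, and the event log directory here is a placeholder):

    ```
    import org.apache.spark.SparkConf
    import org.apache.spark.internal.config.{EVENT_LOG_COMPRESS, EVENT_LOG_DIR, EVENT_LOG_ENABLED}

    val conf = new SparkConf()
    conf.set(EVENT_LOG_ENABLED, true)
    // Pin compression explicitly so the test does not rely on the
    // default value of spark.eventLog.compress.
    conf.set(EVENT_LOG_COMPRESS, false)
    conf.set(EVENT_LOG_DIR, "/tmp/spark-events")  // placeholder path for this sketch
    ```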
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Pass the CIs. Since this change only makes the assumption explicit, the test suite should pass as follows.
    
    ```
    [info] EventLoggingListenerSuite:
    [info] - Basic event logging with compression (837 milliseconds)
    [info] - End-to-end event logging (2 seconds, 99 milliseconds)
    [info] - End-to-end event logging with compression (6 seconds, 966 milliseconds)
    [info] - Event logging with password redaction (8 milliseconds)
    [info] - Spark-33504 sensitive attributes redaction in properties (15 milliseconds)
    [info] - Executor metrics update (32 milliseconds)
    [info] - SPARK-31764: isBarrier should be logged in event log (262 milliseconds)
    [info] Run completed in 11 seconds, 242 milliseconds.
    [info] Total number of tests run: 7
    [info] Suites: completed 1, aborted 0
    [info] Tests: succeeded 7, failed 0, canceled 0, ignored 0, pending 0
    [info] All tests passed.
    [success] Total time: 18 s, completed Sep 21, 2023, 2:34:50 PM
    ```
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #43040 from dongjoon-hyun/SPARK-45263.
    
    Authored-by: Dongjoon Hyun <dh...@apple.com>
    Signed-off-by: Dongjoon Hyun <dh...@apple.com>
---
 .../scala/org/apache/spark/scheduler/EventLoggingListenerSuite.scala   | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/core/src/test/scala/org/apache/spark/scheduler/EventLoggingListenerSuite.scala b/core/src/test/scala/org/apache/spark/scheduler/EventLoggingListenerSuite.scala
index 31db0328f81..edc54e60654 100644
--- a/core/src/test/scala/org/apache/spark/scheduler/EventLoggingListenerSuite.scala
+++ b/core/src/test/scala/org/apache/spark/scheduler/EventLoggingListenerSuite.scala
@@ -33,7 +33,7 @@ import org.apache.spark.deploy.SparkHadoopUtil
 import org.apache.spark.deploy.history.{EventLogFileReader, SingleEventLogFileWriter}
 import org.apache.spark.deploy.history.EventLogTestHelper._
 import org.apache.spark.executor.{ExecutorMetrics, TaskMetrics}
-import org.apache.spark.internal.config.{EVENT_LOG_DIR, EVENT_LOG_ENABLED}
+import org.apache.spark.internal.config.{EVENT_LOG_COMPRESS, EVENT_LOG_DIR, EVENT_LOG_ENABLED}
 import org.apache.spark.io._
 import org.apache.spark.metrics.{ExecutorMetricType, MetricsSystem}
 import org.apache.spark.resource.ResourceProfile
@@ -163,6 +163,7 @@ class EventLoggingListenerSuite extends SparkFunSuite with LocalSparkContext wit
   test("SPARK-31764: isBarrier should be logged in event log") {
     val conf = new SparkConf()
     conf.set(EVENT_LOG_ENABLED, true)
+    conf.set(EVENT_LOG_COMPRESS, false)
     conf.set(EVENT_LOG_DIR, testDirPath.toString)
     val sc = new SparkContext("local", "test-SPARK-31764", conf)
     val appId = sc.applicationId

