This is an automated email from the ASF dual-hosted git repository.

gengliang pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
     new 8a637c8  [SPARK-33084][TEST][FOLLOWUP] Fix a flaky test in SparkContextSuite
8a637c8 is described below

commit 8a637c80c67fdffaaf873e047a3a133c4e2bc16c
Author: Gengliang Wang <gengli...@apache.org>
AuthorDate: Wed Jan 5 15:51:09 2022 +0800

    [SPARK-33084][TEST][FOLLOWUP] Fix a flaky test in SparkContextSuite

    ### What changes were proposed in this pull request?

    The test case `SPARK-33084: Add jar support Ivy URI -- transitive=true will download dependency jars` in `SparkContextSuite` has become flaky:
    - https://github.com/gengliangwang/spark/runs/4698825652?check_suite_focus=true
    - https://github.com/gengliangwang/spark/runs/4698331067?check_suite_focus=true
    - https://github.com/AngersZhuuuu/spark/runs/4697626841?check_suite_focus=true

    The reason is that some of the events collected by `LogAppender` are null, which causes an NPE when the test iterates over them:
    ```
    [info] Cause: java.lang.NullPointerException:
    [info]   at org.apache.spark.SparkContextSuite.$anonfun$new$128(SparkContextSuite.scala:1077)
    [info]   at org.apache.spark.SparkContextSuite.$anonfun$new$128$adapted(SparkContextSuite.scala:1077)
    ```
    This PR fixes the issue to unblock PR builders.

    ### Why are the changes needed?

    Fix a flaky test.

    ### Does this PR introduce _any_ user-facing change?

    No

    ### How was this patch tested?

    Just tests.

    Closes #35098 from gengliangwang/fixFlakyTest.
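The essence of the fix is to keep the raw event buffer private and expose `loggingEvents` as an accessor that filters out null entries, so callers can never trip over them. A minimal, self-contained sketch of that idea (using `String` in place of Log4j's `LogEvent`, and a hypothetical `NullFilterSketch` object, purely for illustration):

```scala
import scala.collection.mutable.ArrayBuffer

// Hypothetical stand-in for the appender: the internal buffer may end up
// holding null entries, so the public accessor returns a filtered copy.
object NullFilterSketch {
  private val _loggingEvents = new ArrayBuffer[String]()

  def record(event: String): Unit = _loggingEvents.append(event)

  // Callers only ever see non-null events, which is what prevents
  // the NullPointerException in the test's assertions.
  def loggingEvents: ArrayBuffer[String] = _loggingEvents.filterNot(_ == null)
}
```

Consumers keep using `loggingEvents` exactly as before; only the null entries disappear from view, which is why no test code outside `SparkFunSuite.scala` needs to change.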
Authored-by: Gengliang Wang <gengli...@apache.org>
Signed-off-by: Gengliang Wang <gengli...@apache.org>
---
 core/src/test/scala/org/apache/spark/SparkFunSuite.scala | 8 +++++---
 1 file changed, 5 insertions(+), 3 deletions(-)

diff --git a/core/src/test/scala/org/apache/spark/SparkFunSuite.scala b/core/src/test/scala/org/apache/spark/SparkFunSuite.scala
index 5be0c96..273ffeb 100644
--- a/core/src/test/scala/org/apache/spark/SparkFunSuite.scala
+++ b/core/src/test/scala/org/apache/spark/SparkFunSuite.scala
@@ -266,23 +266,25 @@ abstract class SparkFunSuite
 class LogAppender(msg: String = "", maxEvents: Int = 1000)
   extends AbstractAppender("logAppender", null, null) {
-  val loggingEvents = new ArrayBuffer[LogEvent]()
+  private val _loggingEvents = new ArrayBuffer[LogEvent]()
   private var _threshold: Level = Level.INFO
 
   override def append(loggingEvent: LogEvent): Unit = loggingEvent.synchronized {
     val copyEvent = loggingEvent.toImmutable
     if (copyEvent.getLevel.isMoreSpecificThan(_threshold)) {
-      if (loggingEvents.size >= maxEvents) {
+      if (_loggingEvents.size >= maxEvents) {
         val loggingInfo = if (msg == "") "." else s" while logging $msg."
         throw new IllegalStateException(
           s"Number of events reached the limit of $maxEvents$loggingInfo")
       }
-      loggingEvents.append(copyEvent)
+      _loggingEvents.append(copyEvent)
     }
   }
 
   def setThreshold(threshold: Level): Unit = {
     _threshold = threshold
   }
+
+  def loggingEvents: ArrayBuffer[LogEvent] = _loggingEvents.filterNot(_ == null)
 }

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org