[ https://issues.apache.org/jira/browse/SPARK-39386?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17949732#comment-17949732 ]
Ish Nagy edited comment on SPARK-39386 at 5/6/25 1:59 PM:
----------------------------------------------------------

Hi all,

I think I identified an important detail regarding this issue: the key environment parameter seems to be a specific JDK version.

If I run this test on top of {{zulu11.31.11-ca-jdk11.0.3-linux_x64}}, it consistently fails for me. None of the other Java 11 Zulu JDKs could reproduce the problem for me, including the very next release ({{zulu11.33.15-ca-jdk11.0.4-linux_x64}}).

I haven't made any effort to reproduce this on platforms other than Linux, and I haven't tried any other JDK distributions yet either.

I'm attaching a short repro script and its output (on Ubuntu 24.04), for your reference:
[^spark-39386.repro.sh]
[^spark-39386.repro.sh_2025-05-06_0607.log]
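A minimal sketch of how one might pin the failing JDK release and run just this suite follows. This is not the attached [^spark-39386.repro.sh]; the install path under /opt and the iteration count are assumptions to adapt to your environment:

{code:bash}
#!/usr/bin/env bash
set -euo pipefail

# Assumption: the Zulu JDK named above is unpacked under /opt;
# adjust the path to wherever it lives on your machine.
export JAVA_HOME=/opt/zulu11.31.11-ca-jdk11.0.3-linux_x64
export PATH="$JAVA_HOME/bin:$PATH"
java -version

# Run only the affected suite from a Spark source checkout. A few
# iterations help distinguish the consistent failure on 11.0.3 from
# ordinary flakiness on other JDKs.
for i in 1 2 3 4 5; do
  ./build/sbt "sql/testOnly org.apache.spark.sql.BloomFilterAggregateQuerySuite"
done
{code}

> Flaky Test: BloomFilterAggregateQuerySuite
> ------------------------------------------
>
>                 Key: SPARK-39386
>                 URL: https://issues.apache.org/jira/browse/SPARK-39386
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL, Tests
>    Affects Versions: 3.3.0
>            Reporter: Dongjoon Hyun
>            Priority: Major
>         Attachments: spark-39386.repro.sh, spark-39386.repro.sh_2025-05-06_0607.log
>
> During Apache Spark 3.3.0 RC5 tests, I found that this test case is very flaky in my environment.
>
> {code:java}
> [info] - Test bloom_filter_agg and might_contain *** FAILED *** (20 seconds, 370 milliseconds)
> [info]   Results do not match for query:
> [info]   Timezone: sun.util.calendar.ZoneInfo[id="America/Los_Angeles",offset=-28800000,dstSavings=3600000,useDaylight=true,transitions=185,lastRule=java.util.SimpleTimeZone[id=America/Los_Angeles,offset=-28800000,dstSavings=3600000,useDaylight=true,startYear=0,startMode=3,startMonth=2,startDay=8,startDayOfWeek=1,startTime=7200000,startTimeMode=0,endMode=3,endMonth=10,endDay=1,endDayOfWeek=1,endTime=7200000,endTimeMode=0]]
> [info]   Timezone Env:
> ...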
> == Results ==
> [info] !== Correct Answer - 1 ==   == Spark Answer - 1 ==
> [info] !struct<>                   struct<positive_membership_test:boolean,negative_membership_test:boolean>
> [info] ![true,false]               [true,true] (QueryTest.scala:244)
> [info] org.scalatest.exceptions.TestFailedException:
> [info]   at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)
> [info]   at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:471)
> [info]   at org.apache.spark.sql.QueryTest$.newAssertionFailedException(QueryTest.scala:234)
> [info]   at org.scalatest.Assertions.fail(Assertions.scala:933)
> [info]   at org.scalatest.Assertions.fail$(Assertions.scala:929)
> [info]   at org.apache.spark.sql.QueryTest$.fail(QueryTest.scala:234)
> [info]   at org.apache.spark.sql.QueryTest$.checkAnswer(QueryTest.scala:244)
> [info]   at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:151)
> [info]   at org.apache.spark.sql.QueryTest.checkAnswer(QueryTest.scala:155)
> [info]   at org.apache.spark.sql.BloomFilterAggregateQuerySuite.$anonfun$new$4(BloomFilterAggregateQuerySuite.scala:98)
> {code}