[jira] [Commented] (SPARK-35520) Spark-SQL test fails on IBM Z for certain config combinations.
[ https://issues.apache.org/jira/browse/SPARK-35520?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17575254#comment-17575254 ]

Vivian Kong commented on SPARK-35520:
-------------------------------------

The test is still failing on Spark v3.3.0 on IBM Z. We appreciate any pointers the community can share so we can look into it. Thanks.

> Spark-SQL test fails on IBM Z for certain config combinations.
> ---------------------------------------------------------------
>
>                 Key: SPARK-35520
>                 URL: https://issues.apache.org/jira/browse/SPARK-35520
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, SQL
>    Affects Versions: 3.1.1
>            Reporter: Simrit Kaur
>            Priority: Major
>
> Some queries from SQL-related test cases (in-joins.sql, in-order-by.sql,
> not-in-group-by.sql and SubquerySuite.scala) fail with specific configuration
> combinations on IBM Z (s390x).
> For example, the query
> sql("select * from l where a = 6 and a not in (select c from r where c is not null)")
> from SubquerySuite.scala fails for the following config combinations:
> |enableNAAJ|enableAQE|enableCodegen|
> |TRUE      |FALSE    |FALSE        |
> |TRUE      |TRUE     |FALSE        |
> The same combinations also cause 2 other queries, in in-joins.sql and
> in-order-by.sql, to fail.
> Another query:
> SELECT Count(*)
> FROM (SELECT *
>       FROM t2
>       WHERE t2a NOT IN (SELECT t3a
>                         FROM t3
>                         WHERE t3h != t2h)) t2
> WHERE t2b NOT IN (SELECT Min(t2b)
>                   FROM t2
>                   WHERE t2b = t2b
>                   GROUP BY t2c);
> from not-in-group-by.sql fails for the following combinations:
> |enableAQE|enableCodegen|
> |FALSE    |TRUE         |
> |FALSE    |FALSE        |
>
> These test cases do not fail on the 3.0.1 release; I believe the failures may
> have been introduced by
> [SPARK-32290|https://issues.apache.org/jira/browse/SPARK-32290].
> Another strange behaviour was observed: if the expected output is 1, 3, I get
> 1, 3, 9; if I update the golden file to expect 1, 3, 9, the output becomes 1, 3.
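For anyone trying to reproduce the first case outside the test harness, here is a
minimal spark-shell sketch. It assumes the enableNAAJ, enableAQE and enableCodegen
flags in the table above map to spark.sql.optimizeNullAwareAntiJoin,
spark.sql.adaptive.enabled and spark.sql.codegen.wholeStage, and it uses placeholder
data for the l and r views rather than the exact SubquerySuite.scala fixtures.

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("SPARK-35520-naaj-repro")
  .master("local[*]")
  .getOrCreate()
import spark.implicits._

// Placeholder views; the real test uses the l/r fixtures defined in SubquerySuite.scala.
Seq((1, 2.0), (2, 1.0), (3, 3.0), (6, -1.0)).toDF("a", "b").createOrReplaceTempView("l")
Seq((Some(2), 3.0), (Some(3), 2.0), (None, 5.0)).toDF("c", "d").createOrReplaceTempView("r")

// The two combinations reported as failing: NAAJ on, whole-stage codegen off,
// AQE either off or on.
for (aqe <- Seq("false", "true")) {
  spark.conf.set("spark.sql.optimizeNullAwareAntiJoin", "true") // enableNAAJ
  spark.conf.set("spark.sql.adaptive.enabled", aqe)             // enableAQE
  spark.conf.set("spark.sql.codegen.wholeStage", "false")       // enableCodegen
  spark.sql(
    "select * from l where a = 6 and a not in (select c from r where c is not null)"
  ).show()
}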
[jira] [Commented] (SPARK-35520) Spark-SQL test fails on IBM Z for certain config combinations.
[ https://issues.apache.org/jira/browse/SPARK-35520?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17425205#comment-17425205 ]

Kun Lu commented on SPARK-35520:
--------------------------------

I've also observed this issue on Spark v3.1.2 on IBM Z. Any comments from the community would be greatly appreciated.
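For the not-in-group-by.sql case quoted in the issue description above, a similar
sketch follows. The column names come from the query text; the t2/t3 data below is
placeholder rather than the golden-file fixtures, so it only shows how to drive the
two reported AQE/codegen combinations, not the exact expected rows.

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("SPARK-35520-not-in-group-by")
  .master("local[*]")
  .getOrCreate()
import spark.implicits._

// Placeholder data; the actual test data lives with not-in-group-by.sql and its golden file.
Seq(("a", 1, 10, 100), ("b", 2, 20, 200), ("c", 3, 30, 300))
  .toDF("t2a", "t2b", "t2c", "t2h").createOrReplaceTempView("t2")
Seq(("a", 100), ("d", 400)).toDF("t3a", "t3h").createOrReplaceTempView("t3")

val query =
  """SELECT Count(*)
    |FROM (SELECT *
    |      FROM t2
    |      WHERE t2a NOT IN (SELECT t3a
    |                        FROM t3
    |                        WHERE t3h != t2h)) t2
    |WHERE t2b NOT IN (SELECT Min(t2b)
    |                  FROM t2
    |                  WHERE t2b = t2b
    |                  GROUP BY t2c)""".stripMargin

// The two combinations reported as failing: AQE off, whole-stage codegen on or off.
for (codegen <- Seq("true", "false")) {
  spark.conf.set("spark.sql.adaptive.enabled", "false")   // enableAQE
  spark.conf.set("spark.sql.codegen.wholeStage", codegen) // enableCodegen
  spark.sql(query).show()
}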