Hisoka-X commented on code in PR #40865:
URL: https://github.com/apache/spark/pull/40865#discussion_r1173210161

##########
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/optimizer/subquery.scala:
##########
@@ -599,10 +600,32 @@ object RewriteCorrelatedScalarSubquery extends Rule[LogicalPlan] with AliasHelpe
       if (Utils.isTesting) {
        assert(mayHaveCountBug.isDefined)
       }
+
+      def queryOutputFoldable(list: Seq[NamedExpression]): Boolean = {
+        trimAliases(list.filter(p => p.exprId.equals(query.output.head.exprId)).head).foldable
+      }
+
+      lazy val resultFoldable = {
+        query match {
+          case Project(expressions, _) =>
+            queryOutputFoldable(expressions)
+          case Aggregate(_, expressions, _) =>
+            queryOutputFoldable(expressions)
+          case _ =>
+            false
+        }
+      }
+
       if (resultWithZeroTups.isEmpty) {
         // CASE 1: Subquery guaranteed not to have the COUNT bug because it evaluates to NULL
         // with zero tuples.
         planWithoutCountBug
+      } else if (resultFoldable) {

Review Comment:
   Already merged before this PR was created.

   > if the subquery is something like `select false from ... group by c` then it will still actually return null on empty inputs.

   That's why we should check whether the column returned by the subquery is foldable. If the result depends on the input, `resultFoldable` will be `false`, so the logic I added is skipped.

   > I think your fix will fix the COUNT(*) is null case, but break the false case, e.g. in a query like select *, (select any_value(false) as result from t1 where t0.a = t1.c) from t0)

   Can you confirm this? Is this answer right?
   <img width="952" alt="image" src="https://user-images.githubusercontent.com/32387433/233515508-8a7055f6-7366-413c-b137-2bdcbbabb0a1.png">

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
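   For readers following along, here is a minimal, Spark-free sketch of the foldability idea being discussed. The `Expr`, `Lit`, `ColumnRef`, and `AnyValue` classes below are illustrative stand-ins, not Catalyst's real classes; in Catalyst itself, `Expression.foldable` is true when an expression can be evaluated to a constant without consulting any input row.

   ```scala
   // Illustrative model of Catalyst-style foldability (hypothetical classes).
   sealed trait Expr { def foldable: Boolean }

   // A literal such as `false` never depends on input rows.
   case class Lit(value: Any) extends Expr { val foldable = true }

   // A column reference such as `c` depends on the current input row.
   case class ColumnRef(name: String) extends Expr { val foldable = false }

   // An aggregate like `any_value(child)` propagates its child's foldability:
   // `any_value(false)` is foldable, `any_value(c)` is not.
   case class AnyValue(child: Expr) extends Expr { val foldable = child.foldable }

   object FoldableDemo {
     def main(args: Array[String]): Unit = {
       // `select any_value(false) ...` -> the projected output is foldable
       println(AnyValue(Lit(false)).foldable)   // true
       // `select any_value(c) ...` -> result depends on input, not foldable
       println(AnyValue(ColumnRef("c")).foldable) // false
     }
   }
   ```

   This mirrors the check in `queryOutputFoldable` above: when the subquery's output expression is foldable, its value on non-empty input is a known constant, which is what the new `resultFoldable` branch relies on.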
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org