WweiL commented on code in PR #44076: URL: https://github.com/apache/spark/pull/44076#discussion_r1409957775
##########
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/StreamingSymmetricHashJoinExec.scala:
##########

```diff
@@ -637,18 +653,22 @@ case class StreamingSymmetricHashJoinExec(
       thisRow: UnsafeRow,
       subIter: Iterator[InternalRow])
     extends CompletionIterator[InternalRow, Iterator[InternalRow]](subIter) {
-
+    // scalastyle:off
     private val iteratorNotEmpty: Boolean = super.hasNext

     override def completion(): Unit = {
       val isLeftSemiWithMatch = joinType == LeftSemi && joinSide == LeftSide && iteratorNotEmpty
       // Add to state store only if both removal predicates do not match,
       // and the row is not matched for left side of left semi join.
+      println(s"!stateKeyWatermarkPredicateFunc(key): ${!stateKeyWatermarkPredicateFunc(key)}" +
+        s" !stateValueWatermarkPredicateFunc(thisRow): ${!stateValueWatermarkPredicateFunc(thisRow)}")
       val shouldAddToState = !stateKeyWatermarkPredicateFunc(key) &&
         !stateValueWatermarkPredicateFunc(thisRow) && !isLeftSemiWithMatch
       if (shouldAddToState) {
+        println(s"wei==add to state: $thisRow")
```

Review Comment:
   If the `if (shouldAddToState)` check above is commented out, then there is data:
   ```
   [info] - SPARK-45637 window agg + window agg -> join on window, append mode *** FAILED *** (6 seconds, 953 milliseconds)
   [info]   == Results ==
   [info]   !== Correct Answer - 1 ==               == Spark Answer - 1 ==
   [info]   !struct<_1:int,_2:int,_3:int,_4:int>    struct<window:struct<start:timestamp,end:timestamp>,count:bigint,count:bigint>
   [info]   ![0,5,5,1]                              [[1969-12-31 16:00:00.0,1969-12-31 16:00:05.0],1,5]
   ```
   My `[0, 5, 5, 1]` answer is formatted poorly, but it represents the same result as the Spark answer.

--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
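The `shouldAddToState` decision in the diff above can be modeled in isolation. The sketch below is a hypothetical stand-in: the three boolean parameters replace the real watermark predicate functions and join-type checks from `StreamingSymmetricHashJoinExec`, which depend on Spark's internal state store machinery; only the boolean structure of the predicate is reproduced here.

```scala
// Minimal stand-in for the shouldAddToState predicate from the diff.
// The three flags replace the real predicate calls:
//   stateKeyWatermarkMatches   ~ stateKeyWatermarkPredicateFunc(key)
//   stateValueWatermarkMatches ~ stateValueWatermarkPredicateFunc(thisRow)
//   isLeftSemiWithMatch        ~ joinType == LeftSemi && joinSide == LeftSide && iteratorNotEmpty
object ShouldAddToStateSketch {
  def shouldAddToState(
      stateKeyWatermarkMatches: Boolean,
      stateValueWatermarkMatches: Boolean,
      isLeftSemiWithMatch: Boolean): Boolean = {
    // Add to the state store only if neither removal predicate matches,
    // and the row is not an already-matched left-side row of a left semi join.
    !stateKeyWatermarkMatches && !stateValueWatermarkMatches && !isLeftSemiWithMatch
  }

  def main(args: Array[String]): Unit = {
    // A fresh, unmatched row is stored.
    assert(shouldAddToState(false, false, false))
    // A row whose key falls behind the state key watermark is dropped.
    assert(!shouldAddToState(true, false, false))
    // A matched left-semi left-side row is dropped even with live watermarks.
    assert(!shouldAddToState(false, false, true))
    println("ok")
  }
}
```

This makes the failure mode in the comment concrete: skipping the `if (shouldAddToState)` guard stores rows that the watermark predicates say should already have been evicted, which changes what the downstream join emits.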