Github user tejasapatil commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16909#discussion_r105093397
  
    --- Diff: 
sql/core/src/main/scala/org/apache/spark/sql/execution/window/WindowFunctionFrame.scala
 ---
    @@ -341,25 +364,27 @@ private[window] final class 
UnboundedFollowingWindowFunctionFrame(
       override def write(index: Int, current: InternalRow): Unit = {
         var bufferUpdated = index == 0
     
    -    // Duplicate the input to have a new iterator
    -    val tmp = input.copy()
    -
    -    // Drop all rows from the buffer for which the input row value is 
smaller than
    +    // Ignore all the rows from the buffer for which the input row value 
is smaller than
         // the output row lower bound.
    -    tmp.skip(inputIndex)
    -    var nextRow = tmp.next()
    +    val iterator = input.generateIterator(startIndex = inputIndex)
    +
    +    def getNextOrNull(iterator: Iterator[UnsafeRow]): UnsafeRow = {
    --- End diff --
    
    @hvanhovell @davies : In the windowing case, no null rows are expected, so 
`null` is used as the indicator that there is no more data left to read. Earlier 
this was done in `RowBuffer.next` [0], which this PR removes (note that it does 
not throw an exception once the end is reached). Adding this as a method on the 
iterator class might be a bad idea, given that this assumption is specific to 
the windowing usage (i.e. no nulls).
    
    I have created a static method within `WindowFunctionFrame` so that it can 
be shared amongst all the window frame implementations.
    
    [0] : 
https://github.com/apache/spark/blob/417140e441505f20eb5bd4943ce216c3ec6adc10/sql/core/src/main/scala/org/apache/spark/sql/execution/window/RowBuffer.scala#L37
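    To make the sentinel convention concrete, here is a minimal, hedged sketch of 
such a helper. `WindowFrameHelper` and the use of a plain `String` row type are 
illustrative stand-ins (the real code operates on Spark's `UnsafeRow` inside 
`WindowFunctionFrame`); the point is only the null-as-end-of-data contract:

```scala
// Hypothetical stand-in for the shared helper described above. Rows are
// assumed to never be null, so a null return unambiguously means the
// iterator is exhausted.
object WindowFrameHelper {
  def getNextOrNull[T >: Null](iterator: Iterator[T]): T =
    if (iterator.hasNext) iterator.next() else null
}

object Demo {
  def main(args: Array[String]): Unit = {
    val rows = Iterator("row0", "row1")
    // Callers loop until the null sentinel instead of calling hasNext twice.
    var row = WindowFrameHelper.getNextOrNull(rows)
    while (row != null) {
      println(row)
      row = WindowFrameHelper.getNextOrNull(rows)
    }
  }
}
```

    The `T >: Null` bound restricts the helper to reference types, which is what 
makes the null sentinel safe to use here.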


