Github user setjet commented on a diff in the pull request:

    https://github.com/apache/spark/pull/18113#discussion_r150381736
  
    --- Diff: sql/core/src/main/scala/org/apache/spark/sql/execution/aggregate/typedaggregators.scala ---
    @@ -26,43 +26,64 @@ import org.apache.spark.sql.expressions.Aggregator
     // This file defines internal implementations for aggregators.
     
     ////////////////////////////////////////////////////////////////////////////////////////////////////
     
    +class TypedSumDouble[IN](val f: IN => Double)
    +  extends Aggregator[IN, java.lang.Double, java.lang.Double] {
    +
    +  override def zero: java.lang.Double = 0.0
    +  override def reduce(b: java.lang.Double, a: IN): java.lang.Double =
    --- End diff --
    
    As discussed previously, the boxing is needed to give min/max appropriate
    return types. It would of course not be needed if we aligned with the
    current (incorrect) return values.

    I have bounced back and forth between the return values multiple times
    now, so it might be worthwhile to have some more discussion.
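
    For context, here is a minimal sketch (not the code in this PR; the class
    name SketchTypedMinDouble is hypothetical) of how a boxed java.lang.Double
    lets a min-style aggregator distinguish an empty input from real data by
    returning null instead of a sentinel value:

        import org.apache.spark.sql.{Encoder, Encoders}
        import org.apache.spark.sql.expressions.Aggregator

        class SketchTypedMinDouble[IN](val f: IN => Double)
          extends Aggregator[IN, java.lang.Double, java.lang.Double] {

          // null marks "no values seen yet"; a primitive 0.0 or Double.MaxValue
          // would be indistinguishable from genuine data
          override def zero: java.lang.Double = null

          override def reduce(b: java.lang.Double, a: IN): java.lang.Double =
            if (b == null) f(a) else math.min(b, f(a))

          override def merge(b1: java.lang.Double, b2: java.lang.Double): java.lang.Double =
            if (b1 == null) b2 else if (b2 == null) b1 else math.min(b1, b2)

          // the boxed output type lets an empty Dataset surface as null
          override def finish(reduction: java.lang.Double): java.lang.Double = reduction

          override def bufferEncoder: Encoder[java.lang.Double] = Encoders.DOUBLE
          override def outputEncoder: Encoder[java.lang.Double] = Encoders.DOUBLE
        }

    With a primitive Double buffer and output, the only option for an empty
    input would be a sentinel such as Double.MaxValue, which is the kind of
    incorrect return value mentioned above.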

