sarutak commented on a change in pull request #34593:
URL: https://github.com/apache/spark/pull/34593#discussion_r755696900



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/functions.scala
##########
@@ -2197,13 +2197,23 @@ object functions {
   def round(e: Column): Column = round(e, 0)
 
   /**
-   * Round the value of `e` to `scale` decimal places with HALF_UP round mode
-   * if `scale` is greater than or equal to 0 or at integral part when `scale` is less than 0.
+   * Returns the value of the column `e` rounded to 0 decimal places with HALF_UP round mode.
    *
    * @group math_funcs
    * @since 1.5.0
    */
-  def round(e: Column, scale: Int): Column = withExpr { Round(e.expr, Literal(scale)) }
+  def round(e: Column, scale: Int): Column = round(e, scale, "half_up")
+
+  /**
+   * Round the value of `e` to `scale` decimal places with given round mode, default: HALF_UP
+   * if `scale` is greater than or equal to 0 or at integral part when `scale` is less than 0.
+   *
+   * @group math_funcs
+   * @since 3.3.0
+   */
+  def round(e: Column, scale: Int, mode: String): Column = withExpr {

Review comment:
       I think it's better to make the arguments `Column` for the overloaded one.
   What do you think, @HyukjinKwon?
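
   To illustrate the semantics being documented here, a minimal sketch using plain `java.math.BigDecimal` (the helper `roundHalfUp` is hypothetical and not part of Spark): for `scale >= 0` the value is rounded to that many decimal places with HALF_UP, and for `scale < 0` it is rounded at the integral part.

   ```scala
   import java.math.{BigDecimal => JBigDecimal, RoundingMode}

   // Hypothetical helper mirroring the documented round() semantics:
   // scale >= 0 rounds to `scale` decimal places; scale < 0 rounds the
   // integral part (e.g. scale = -1 rounds to the nearest ten).
   def roundHalfUp(value: Double, scale: Int): Double =
     new JBigDecimal(value.toString)
       .setScale(scale, RoundingMode.HALF_UP)
       .doubleValue

   println(roundHalfUp(2.345, 2))  // 2.35
   println(roundHalfUp(125.0, -1)) // 130.0
   ```

   This is only a sketch of the rounding behavior; the actual Spark implementation goes through the `Round` expression rather than `BigDecimal` directly.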




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


