MrPowers commented on a change in pull request #31073:
URL: https://github.com/apache/spark/pull/31073#discussion_r553619800



##########
File path: sql/core/src/main/scala/org/apache/spark/sql/functions.scala
##########
@@ -2841,6 +2841,31 @@ object functions {
   // DateTime functions
   //////////////////////////////////////////////////////////////////////////////////////////////
 
+  /**
+   * Creates a datetime interval
+   *
+   * @param years Number of years
+   * @param months Number of months
+   * @param weeks Number of weeks
+   * @param days Number of days
+   * @param hours Number of hours
+   * @param mins Number of mins
+   * @param secs Number of secs
+   * @return A datetime interval
+   * @group datetime_funcs
+   * @since 3.2.0
+   */
+  def make_interval(
+      years: Column = lit(0),

Review comment:
   @MaxGekk - I [added a test](https://github.com/apache/spark/pull/31073/commits/c9492a99ffa8a3ab1380ebde21c1ea200d02c187) to demonstrate that `make_interval` can be called from the Java API (with the Scala method that has default values) when all 7 arguments are passed to the method. For example, this Java code works:
   
   ```java
   // assumes: import static org.apache.spark.sql.functions.*;
   Column twoYears = make_interval(lit(2), lit(0), lit(0), lit(0), lit(0), lit(0), lit(0));
   Dataset<Row> df = spark.createDataFrame(rows, schema)
       .withColumn("plus_two_years", col("some_date").plus(twoYears));
   ```
   
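   (For context: scalac emits default argument values as synthetic `make_interval$default$N` methods, which Java code cannot use idiomatically, so Java callers have to pass all seven `Column` arguments explicitly, as above.)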
   Someone with Java experience should double-check this result. If the current implementation works with the Java API, then I think it's OK to keep it, especially because the default arguments allow for much more elegant Scala code.
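   
   For contrast, here is a minimal sketch of what the Scala call site would look like (assuming the full signature follows the Javadoc above, i.e. every parameter defaults to `lit(0)`; `df` is any `Dataset[Row]` with a `some_date` column):
   
   ```scala
   // Sketch only: assumes all seven parameters of make_interval default to lit(0),
   // so Scala callers can use a named argument and skip the rest.
   import org.apache.spark.sql.functions.{col, lit, make_interval}
   
   val twoYears = make_interval(years = lit(2)) // defaults fill the other six
   val result = df.withColumn("plus_two_years", col("some_date") + twoYears)
   ```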



