maropu commented on a change in pull request #32040: URL: https://github.com/apache/spark/pull/32040#discussion_r606513566
##########
File path: sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/TimeWindow.scala
##########

@@ -27,6 +27,35 @@
 import org.apache.spark.sql.catalyst.util.IntervalUtils
 import org.apache.spark.sql.types._
 import org.apache.spark.unsafe.types.UTF8String
+@ExpressionDescription(
+  usage = """
+    _FUNC_(time_column, window_duration) - Returns the TimeWindow which contains the given
+      timestamp, based on the window_duration provided.
+
+    _FUNC_(time_column, window_duration, slide_duration) - Returns the TimeWindow(s) which
+      contain the given timestamp, based on the window_duration and slide_duration provided.
+
+    _FUNC_(time_column, window_duration, slide_duration, start_time) - Returns the TimeWindow(s)
+      which contain the given timestamp, based on the window_duration, slide_duration, and
+      start_time provided.
+  """,
+  examples = """
+    Examples:
+      > SELECT _FUNC_(timestamp('2020-04-25 15:49:11.914'), '5 minutes');
+       2020-04-25 15:45:00, 2020-04-25 15:50:00
+
+      > SELECT _FUNC_(timestamp('2020-04-25 15:49:11.914'), '5 minutes', '2 minutes');
+       2020-04-25 15:45:00, 2020-04-25 15:50:00
+       2020-04-25 15:47:00, 2020-04-25 15:52:00
+       2020-04-25 15:49:00, 2020-04-25 15:54:00
+
+      > SELECT _FUNC_(timestamp('2020-04-25 15:49:11.914'), '5 minutes', '2 minutes', '30 seconds');
+       2020-04-25 15:44:30, 2020-04-25 15:49:30
+       2020-04-25 15:46:30, 2020-04-25 15:51:30
+       2020-04-25 15:48:30, 2020-04-25 15:53:30
+  """,
+  group = "datetime_funcs",
+  since = "2.0.0")

Review comment:
    This is not an unevaluable SQL expr, so we don't put `ExpressionDescription` here.
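As a side note, the window bounds shown in the doc examples above follow from simple modular arithmetic over the slide duration. The sketch below reproduces them in plain Python; it is only an illustration of the semantics (function name and structure are mine), not Spark's actual TimeWindow expansion code:

```python
from datetime import datetime, timedelta

def sliding_windows(ts, window, slide, start_offset=timedelta(0)):
    """Return every [start, end) window of length `window`, sliding by
    `slide` and shifted by `start_offset`, that contains timestamp `ts`.
    Illustrative sketch only -- not Spark's implementation."""
    epoch = datetime(1970, 1, 1)
    # Time elapsed past the most recent slide boundary before (or at) ts.
    elapsed = (ts - epoch - start_offset) % slide
    start = ts - elapsed
    windows = []
    # Walk backwards while the window still covers ts.
    while start + window > ts:
        windows.append((start, start + window))
        start -= slide
    return sorted(windows)

ts = datetime(2020, 4, 25, 15, 49, 11, 914000)
# Matches the third example: 3 overlapping windows (ceil(5min / 2min)).
for s, e in sliding_windows(ts, timedelta(minutes=5), timedelta(minutes=2),
                            timedelta(seconds=30)):
    print(s, e)
```

With window == slide this degenerates to the tumbling case in the first example (a single window, 15:45:00 to 15:50:00).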