[ https://issues.apache.org/jira/browse/SPARK-14160?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Burak Yavuz updated SPARK-14160:
--------------------------------
    Description: 
This JIRA is to track the status of event-time windowing operations for continuous queries.

The proposition is to add a
{code}
window(timeColumn, windowDuration, slideDuration, startTime)
{code}
expression that will bucket time columns into time windows. This expression will be useful in both batch analysis and streaming. With streaming, it will open up the use case for event-time window aggregations.

For R and Python interoperability, we will take windowDuration, slideDuration, and startTime as strings and parse them as interval lengths.

  was:
This JIRA is to track the status of event-time windowing operations for continuous queries.

The proposition is to add a {code}window{code} expression that will bucket time columns into time windows. This expression will be useful in both batch analysis and streaming. With streaming, it will open up the use case for event-time window aggregations.


> Windowing for structured streaming
> ----------------------------------
>
>                 Key: SPARK-14160
>                 URL: https://issues.apache.org/jira/browse/SPARK-14160
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>            Reporter: Burak Yavuz
>
> This JIRA is to track the status of event-time windowing operations for
> continuous queries.
> The proposition is to add a
> {code}
> window(timeColumn, windowDuration, slideDuration, startTime)
> {code}
> expression that will bucket time columns into time windows. This
> expression will be useful in both batch analysis and streaming. With
> streaming, it will open up the use case for event-time window aggregations.
> For R and Python interoperability, we will take windowDuration,
> slideDuration, and startTime as strings and parse them as interval lengths.
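To illustrate the bucketing semantics the proposed {code}window(timeColumn, windowDuration, slideDuration, startTime){code} expression describes, here is a minimal pure-Python sketch. This is not Spark's implementation; the function name {code}assign_windows{code} and the use of plain integer timestamps (e.g. seconds) are assumptions for illustration. With a slide shorter than the window, a single event lands in multiple overlapping windows.

{code}
def assign_windows(ts, window_duration, slide_duration, start_time=0):
    """Return the (window_start, window_end) pairs that contain timestamp ts.

    All arguments are in the same integer time unit (e.g. seconds).
    Illustrative sketch of event-time window bucketing, not Spark's code.
    """
    # The most recent window start at or before ts, aligned to start_time.
    last_start = ts - ((ts - start_time) % slide_duration)
    windows = []
    s = last_start
    # Walk backwards by slide_duration while the window still covers ts.
    while s + window_duration > ts:
        windows.append((s, s + window_duration))
        s -= slide_duration
    return sorted(windows)

# An event at t=13 with a 10-unit window sliding every 5 units falls into
# two overlapping windows:
assign_windows(13, 10, 5)   # [(5, 15), (10, 20)]

# When slide == window (tumbling windows), each event is in exactly one:
assign_windows(7, 5, 5)     # [(5, 10)]
{code}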
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org