In Structured Streaming, there is the notion of event-time windowing.


However, this is not quite the same as DStreams' windowing operations: in
Structured Streaming, windowing groups the data into fixed time windows, and
every event is assigned to the group for the window its event time falls in.
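To make the fixed-window grouping concrete, here is a plain-Python sketch of the semantics (not actual Spark code; the function name, the 10-minute window length, and the sample events are all illustrative assumptions):

```python
# Plain-Python sketch of fixed (tumbling) event-time windows, as in
# Structured Streaming: each event is assigned to exactly one window
# based on its event time, not on when it arrives.
from collections import defaultdict

WINDOW = 600  # window length in seconds (10 minutes), chosen for illustration

def tumbling_windows(events):
    """Group (event_time_seconds, value) pairs by their fixed window start."""
    groups = defaultdict(list)
    for ts, value in events:
        window_start = (ts // WINDOW) * WINDOW  # start of the window containing ts
        groups[window_start].append(value)
    return dict(groups)

events = [(30, "a"), (610, "b"), (650, "c"), (1300, "d")]
print(tumbling_windows(events))
# → {0: ['a'], 600: ['b', 'c'], 1200: ['d']}
```

Note how "b" and "c" land in the same group because their event times fall in the same fixed window, regardless of arrival order.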


DStreams, by contrast, simply output all the data that falls within a limited
window in time (the last 10 minutes, for example).
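The DStream-style behaviour can be sketched the same way (again plain Python, not Spark; the function name and sample data are illustrative):

```python
# Plain-Python sketch of DStream-style sliding-window semantics:
# at each evaluation, output everything that falls in the last
# `length` seconds, rather than grouping by fixed windows.

def last_window(events, now, length=600):
    """Return values of (event_time_seconds, value) pairs within the last `length` seconds."""
    return [v for ts, v in events if now - length < ts <= now]

events = [(30, "a"), (610, "b"), (650, "c"), (1300, "d")]
print(last_window(events, now=1200))
# → ['b', 'c'] — only the events from the last 10 minutes are emitted
```

The key difference: here the window slides with time and an event can appear in several successive outputs, whereas in the fixed-window case each event belongs to exactly one group.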

The same question was also asked here, in case it makes things clearer:
<https://stackoverflow.com/questions/49821646/is-there-someway-to-do-the-eqivalent-of-reducebykeyandwindow-in-spark-structured>

How can the latter be achieved in Structured Streaming?



--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/
