Hello,

I would appreciate insights on the following questions:

1) Using Spark Streaming, I would like to keep windowed statistics
for the past 30, 60, and 120 minutes.
Is there a more integrated way of doing this than creating three
separate windows over the same DStream?
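For what it's worth, the "three separate windows" approach I describe would look roughly like the sketch below (the socket source, host, and port are placeholders; batch and slide intervals are assumptions):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Minutes, Seconds, StreamingContext}

object MultiWindowSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("multi-window-stats")
    // 60-second batches; slide interval must be a multiple of this.
    val ssc = new StreamingContext(conf, Seconds(60))

    // Placeholder source; any DStream works the same way.
    val events = ssc.socketTextStream("localhost", 9999)
    events.cache() // let all three windows reuse the same batch RDDs

    // Three windows over the same DStream, each sliding every minute.
    val win30  = events.window(Minutes(30),  Minutes(1))
    val win60  = events.window(Minutes(60),  Minutes(1))
    val win120 = events.window(Minutes(120), Minutes(1))

    ssc.start()
    ssc.awaitTermination()
  }
}
```

Caching the base DStream should at least avoid recomputing the shared batches across the three windows.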

2) On each slide interval, the function I pass to foreachRDD is
called with all the data belonging to the corresponding time window.
Ideally, the result of the function's processing would be a "table" of
rows (keys) with 10 columns of values. Is such a table-like format
possible, or should I split each of the 10 values into a separate list
of (key, value) pairs?
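By "table" I mean something like the following sketch, which keeps all 10 values together per key as (key, Array[Double]) pairs instead of ten separate (key, value) streams (the CSV record format "key,v1,...,v10" is an invented example):

```scala
import org.apache.spark.streaming.dstream.DStream

// Assumed record format: "key,v1,v2,...,v10" (hypothetical).
def buildTable(windowed: DStream[String]): Unit = {
  windowed.foreachRDD { rdd =>
    val table = rdd
      .map { line =>
        val fields = line.split(",")
        // (key, array of the 10 numeric columns)
        (fields.head, fields.tail.map(_.toDouble))
      }
      // Element-wise sum of the 10 columns per key; any
      // per-column aggregation could go here instead.
      .reduceByKey { (a, b) =>
        a.zip(b).map { case (x, y) => x + y }
      }
    // `table` is now one row per key with 10 value columns.
  }
}
```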

3) The processed "table" (or the separate lists of (key, value) pairs)
should be made available to other Spark applications.
Is there an integrated way of doing this (some form of shared memory?),
or should it be handled in a custom/manual fashion?
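The custom/manual fashion I have in mind is writing each slide's result to external storage that the other applications poll (the HDFS path below is a placeholder):

```scala
import org.apache.spark.streaming.dstream.DStream

def publishTable(table: DStream[(String, Array[Double])]): Unit = {
  // foreachRDD's (rdd, time) overload gives each slide a unique path.
  table.foreachRDD { (rdd, time) =>
    rdd
      .map { case (key, cols) => (key +: cols.map(_.toString)).mkString(",") }
      .saveAsTextFile(s"hdfs:///shared/window-stats-${time.milliseconds}")
  }
}
```

Other applications would then read the latest directory, but I am hoping something less manual exists.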

Thanks,
Jar.
