In general, working out the minimum or maximum of, say, prices (I do not know your use case) is pretty straightforward.
For example:

// price is assumed to be a DStream[Double]
val maxValue = price.reduceByWindow(
  (x: Double, y: Double) => if (x > y) x else y,
  Seconds(windowLength),
  Seconds(slidingInterval))
maxValue.print()
The average values are a bit more involved: you need to carry both a running sum and a count through the window and divide at the end.
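A minimal sketch of that approach, assuming a keyed DStream[(String, Double)] named pricesByKey (the stream name and the 10-minute window / 1-minute slide are assumptions, not from the thread):

import org.apache.spark.streaming.Seconds

// pair each value with a count of 1, then sum both parts over the window
val sumCounts = pricesByKey.mapValues(v => (v, 1L))
val windowedSums = sumCounts.reduceByKeyAndWindow(
  (a: (Double, Long), b: (Double, Long)) => (a._1 + b._1, a._2 + b._2),
  Seconds(600),  // window length: 10 minutes
  Seconds(60))   // sliding interval: 1 minute
val averages = windowedSums.mapValues { case (sum, count) => sum / count }
averages.print()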
Hi Matthias,
Say you have the following:

"Batch interval" is the basic interval at which the system will receive the data in batches.

val ssc = new StreamingContext(sparkConf, Seconds(n))

// window length: the duration of the window below; it must be a multiple of the batch interval n
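Putting that together, a minimal runnable sketch (the socket source, the port, and the 10-second window with a 4-second slide are assumptions, chosen as multiples of a 2-second batch interval):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

val sparkConf = new SparkConf().setAppName("WindowExample")
// batch interval n = 2 seconds
val ssc = new StreamingContext(sparkConf, Seconds(2))
// hypothetical source; any DStream is windowed the same way
val lines = ssc.socketTextStream("localhost", 9999)
// window length 10s (5 batches), sliding interval 4s (2 batches)
val windowed = lines.window(Seconds(10), Seconds(4))
windowed.print()
ssc.start()
ssc.awaitTermination()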
Hi,
If I want to have a sliding average over 10 minutes for some keys, I can do something like

groupBy(window(…), "my-key").avg("some-values")

in Spark 2.0.
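Spelled out, that Spark 2.0 call might look like the following sketch (the streaming DataFrame df, its "timestamp" column, and the 1-minute slide are assumptions; the other column names come from the snippet above):

import org.apache.spark.sql.functions.{avg, col, window}

// df is assumed to be a streaming DataFrame with columns
// "timestamp", "my-key", and "some-values"
val averages = df
  .groupBy(window(col("timestamp"), "10 minutes", "1 minute"), col("my-key"))
  .avg("some-values")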
I am trying to implement this sliding average using Spark 1.6.x. I tried reduceByKeyAndWindow but did not find a solution. IMO I have to track the sum and the count separately and divide them myself.