If you are trying to keep such long-term state, it will be more robust to use a dedicated data store (Cassandra/HBase/etc.) that is designed for long-term storage.
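To avoid recalculating every granularity on each change (the performance concern in the question below), one option is to keep a running counter per (granularity, bucket) pair and bump only the buckets a new event falls into. The sketch below is plain Python, not the Spark API: `update_session` is a hypothetical stand-in for the kind of update function you would pass to `updateStateByKey`, and the bucket formats are illustrative.

```python
from datetime import datetime, timezone

# Bucket key format per granularity (illustrative choices).
GRANULARITIES = {
    "minute": "%Y-%m-%d %H:%M",
    "hour": "%Y-%m-%d %H",
    "day": "%Y-%m-%d",
}

def update_session(state, events):
    """Merge a batch of (timestamp, value) events into the session state.

    Incremental: each event only touches the minute/hour/day buckets it
    falls into, so nothing is recomputed from scratch on each batch.
    """
    state = state if state else {g: {} for g in GRANULARITIES}
    for ts, value in events:
        for gran, fmt in GRANULARITIES.items():
            bucket = ts.strftime(fmt)
            buckets = state.setdefault(gran, {})
            buckets[bucket] = buckets.get(bucket, 0) + value
    return state

# Example: two events in the same hour, different minutes.
t1 = datetime(2015, 7, 28, 16, 37, tzinfo=timezone.utc)
t2 = datetime(2015, 7, 28, 16, 38, tzinfo=timezone.utc)
state = update_session(None, [(t1, 1), (t2, 2)])
```

The returned state could then be flushed periodically to the external store, keeping only recent buckets in the streaming state itself.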
On Tue, Jul 28, 2015 at 4:37 PM, swetha <swethakasire...@gmail.com> wrote:

> Hi TD,
>
> We have a requirement to maintain the user session state and to
> maintain/update the metrics for minute, day and hour granularities for a
> user session in our Streaming job. Can I keep those granularities in the
> state and recalculate each time there is a change? How would the
> performance be impacted?
>
> Thanks,
> Swetha
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-Json-file-groupby-function-tp9618p24041.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.