Re: Maintaining overall cumulative data in Spark Streaming

2015-10-30 Thread Silvio Fiorito
eration? From: Sandeep Giri [mailto:sand...@knowbigdata.com] Sent: Thursday, October 29, 2015 3:09 PM To: user <user@spark.apache.org>; dev <d...@spark.apache.org> Sub

Re: Maintaining overall cumulative data in Spark Streaming

2015-10-30 Thread Sandeep Giri
Giri [mailto:sand...@knowbigdata.com] >> *Sent:* Thursday, October 29, 2015 3:09 PM >> *To:* user <user@spark.apache.org>; dev <d...@spark.apache.org> >> *Subject:* Maintaining overall cumulative data in Spark Streaming >> >> >> >> Dear All,

Maintaining overall cumulative data in Spark Streaming

2015-10-29 Thread Sandeep Giri
Dear All, If a continuous stream of text is coming in and you have to keep publishing the overall word count so far since 0:00 today, what would you do? Publishing the results for a window is easy but if we have to keep aggregating the results, how to go about it? I have tried to keep an
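One common approach to this problem in the 2015-era DStream API is a stateful transformation such as `updateStateByKey`, which carries a per-key running total across micro-batches (checkpointing must be enabled for stateful DStreams). The sketch below is an illustration, not the poster's actual code: the real Spark calls are shown only in comments, and the executable part simulates the per-key update function over a few micro-batches with a plain dict so the cumulative behaviour is visible.

```python
# Sketch (assumption: Spark 1.x DStream API). In a real streaming job
# you would write something like:
#
#   ssc.checkpoint("...")  # required for stateful transformations
#   counts = words.map(lambda w: (w, 1)).updateStateByKey(update_count)
#
# Below, the same update function is driven by hand over two simulated
# micro-batches to show the running total accumulating since the start.
from collections import defaultdict

def update_count(new_values, running):
    """Called once per key per batch: new_values is the list of counts
    seen this batch; running is the total so far (None on first sight)."""
    return sum(new_values) + (running or 0)

def apply_batch(state, batch):
    # Group this batch's words by key, then fold each key into the state,
    # mimicking what updateStateByKey does across batches.
    grouped = defaultdict(list)
    for word in batch:
        grouped[word].append(1)
    for word, ones in grouped.items():
        state[word] = update_count(ones, state.get(word))
    return state

state = {}
apply_batch(state, ["spark", "streaming", "spark"])   # batch 1
apply_batch(state, ["spark", "count"])                # batch 2
# state now holds cumulative counts: spark=3, streaming=1, count=1
```

To reset the total at 0:00 each day, one option under this approach is to have the update function return `None` (which drops the key's state) when a new day begins, or simply to restart the job with a fresh checkpoint directory at midnight.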

RE: Maintaining overall cumulative data in Spark Streaming

2015-10-29 Thread Silvio Fiorito
and...@knowbigdata.com> Sent: 10/29/2015 6:08 PM To: user <user@spark.apache.org>; dev <d...@spark.apache.org> Subject: Maintaining overall cumulative data in Spark Streaming Dear All, If a continuous stream of text is coming in and you have to keep publishing the overall word count

RE: Maintaining overall cumulative data in Spark Streaming

2015-10-29 Thread Sandeep Giri
ent:* Thursday, October 29, 2015 3:09 PM > *To:* user <user@spark.apache.org>; dev <d...@spark.apache.org> > *Subject:* Maintaining overall cumulative data in Spark Streaming > > > > Dear All, > > > > If a continuous stream of text is coming in and you ha