So, at any point does a stream stop producing RDDs? If not, is there a possibility, if the batching isn't working or is broken, that your disk / RAM will fill up to the brim with an unprocessed RDD backlog?
On Fri, Dec 19, 2014 at 1:29 PM, Silvio Fiorito <silvio.fior...@granturing.com> wrote:

> Batches will wait for the previous batch to finish. The monitoring
> console will show you the backlog of waiting batches.
>
> From: Asim Jalis <asimja...@gmail.com>
> Date: Friday, December 19, 2014 at 1:16 PM
> To: user <user@spark.apache.org>
> Subject: Spark Streaming Threading Model
>
> Q: In Spark Streaming, if your DStream transformation and output action
> take longer than the batch duration, will the system process the next batch
> in another thread? Or will it just wait until the first batch's RDD is
> processed? In other words, does it build up a queue of buffered RDDs
> awaiting processing, or does it just process them?
>
> Asim

--
jay vyas
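For context, the serial-by-default behavior described in the reply can be sketched with a minimal StreamingContext. This is an illustrative sketch, not code from the thread: the port and rate values are made up for the example, `spark.streaming.receiver.maxRate` is a real setting that caps receiver ingestion so received-but-unprocessed data does not grow without bound, and `spark.streaming.concurrentJobs` is an undocumented, experimental setting controlling how many batch jobs run at once (default 1, i.e. serial).

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Sketch: 5-second batch interval. If processing a batch takes longer
// than 5 seconds, subsequent batches queue up and wait (one job at a
// time by default); the streaming UI shows the growing backlog.
val conf = new SparkConf()
  .setAppName("streaming-threading-sketch")
  // Cap records/sec per receiver so blocks that have been received but
  // not yet processed don't fill memory/disk (rate is illustrative).
  .set("spark.streaming.receiver.maxRate", "1000")
  // Undocumented/experimental: values > 1 let batches run concurrently,
  // at the cost of in-order processing guarantees. Default is 1.
  .set("spark.streaming.concurrentJobs", "1")

val ssc = new StreamingContext(conf, Seconds(5))
val lines = ssc.socketTextStream("localhost", 9999) // hypothetical source
lines.count().print() // if this job exceeds 5s, the next batch waits

ssc.start()
ssc.awaitTermination()
```

Note that the DStream keeps generating batches every interval regardless of processing speed, so without a rate cap the queue of pending batches (and the received data backing them) can grow for as long as processing lags behind.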