hummm, got it. Thank you Akhil.


--------------------------------


 
Thanks&Best regards!
罗辉 San.Luo



----- Original Message -----
From: Akhil Das <ak...@sigmoidanalytics.com>
To: 罗辉 <luohui20...@sina.com>
Cc: user <user@spark.apache.org>
Subject: Re: SparkStreaming batch processing time question
Date: April 1, 2015, 14:31



It will add scheduling delay for the new batch. The new batch's data will be 
processed only after the previous batch finishes. When the delay grows too high, 
it can sometimes throw fetch failures, because the batch data may already have 
been removed from memory.
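The delay growth Akhil describes can be sketched with a small, Spark-free simulation (the numbers are hypothetical; it assumes each batch starts processing only after the previous one finishes, which is how Spark Streaming schedules batches):

```python
# Simulate Spark Streaming scheduling delay when processing time
# exceeds the batch interval. Batches of data arrive every `interval`
# seconds, but each takes `processing` seconds to handle; since a
# batch starts only after the previous one finishes, delay accumulates.

def scheduling_delays(interval, processing, num_batches):
    delays = []
    finish = 0.0  # time the previous batch finished processing
    for i in range(num_batches):
        arrival = i * interval          # when this batch's data is ready
        start = max(arrival, finish)    # must wait for the previous batch
        delays.append(start - arrival)  # scheduling delay for this batch
        finish = start + processing
    return delays

# Processing (3s) slower than the interval (2s): delay grows by 1s per batch.
print(scheduling_delays(interval=2.0, processing=3.0, num_batches=5))
# Processing (1s) faster than the interval (2s): no delay accumulates.
print(scheduling_delays(interval=2.0, processing=1.0, num_batches=5))
```

The first case shows why a sustained processing time above the batch interval is unstable: the backlog (and the memory held by unprocessed batches) grows without bound, which is what eventually leads to the fetch failures mentioned above.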



Thanks 
Best Regards

On Wed, Apr 1, 2015 at 11:35 AM, <luohui20...@sina.com> wrote:


hi guys:
          I got a question when reading 
http://spark.apache.org/docs/latest/streaming-programming-guide.html#setting-the-right-batch-interval.
 
         What will happen to the streaming data if the batch processing time is 
longer than the batch interval? Will the next batch's data be delayed, or will 
the unfinished processing job be discarded?
         
        Thanks for any ideas shared.


--------------------------------


 
Thanks & Best regards!
罗辉 San.Luo
