Spark will just queue up the subsequent batches; however, if the delay is constant rather than an occasional spike, the queue keeps growing and you may eventually start losing batches. Spark Streaming can absorb spikes in processing time, but if you know you're consistently running over your batch duration you either need to increase the duration or enable backpressure support (available in 1.5+). See: 
http://spark.apache.org/docs/latest/configuration.html#spark-streaming
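For example, a minimal sketch of how you might enable it when submitting the job (the application name and rate value here are hypothetical; the configuration keys are from the Spark Streaming configuration page linked above):

```shell
# Enable backpressure so the ingestion rate adapts dynamically to how fast
# batches are actually being processed, instead of letting batches pile up.
spark-submit \
  --conf spark.streaming.backpressure.enabled=true \
  --conf spark.streaming.receiver.maxRate=10000 \
  my_streaming_app.py
```

With backpressure enabled, Spark uses the recent batch processing times to throttle the receivers, so a sustained slowdown lowers the input rate rather than filling the queue; `spark.streaming.receiver.maxRate` just caps the upper bound.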

________________________________________
From: pyspark2555 <scet.a...@gmail.com>
Sent: Sunday, January 17, 2016 11:32 AM
To: user@spark.apache.org
Subject: Spark Streaming: BatchDuration and Processing time

Hi,

If BatchDuration is set to 1 second in StreamingContext and the actual
processing time is longer than one second, then how does Spark handle that?

For example, I am receiving a continuous Input stream. Every 1 second (batch
duration), the RDDs will be processed. What if this processing time is
longer than 1 second? What happens in the next batch duration?

Thanks.
Amit



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Spark-Streaming-BatchDuration-and-Processing-time-tp25986.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org

