Hi all,
I have a question after reading
http://spark.apache.org/docs/latest/streaming-programming-guide.html#setting-the-right-batch-interval.
What happens to the streaming data if the batch processing time is
longer than the batch interval? Is the next batch delayed until the
current one finishes, or is the unfinished processing job discarded?
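To make the scenario concrete, here is a toy simulation (plain Python, not actual Spark code) of what I think might happen if batches queue up instead of being dropped; the function name and the model are my own assumptions, not anything from the Spark docs:

```python
# Toy model of a streaming scheduler where each batch must wait for the
# previous batch to finish. This is a hypothetical sketch, NOT Spark's
# actual implementation; it only illustrates the "delayed batches" case.

def simulate(batch_interval, processing_time, num_batches):
    """Return the scheduling delay of each batch (how long it waited)."""
    delays = []
    finish_time = 0.0  # when the previous batch finished processing
    for i in range(num_batches):
        arrival = i * batch_interval           # batch becomes ready
        start = max(arrival, finish_time)      # wait for the previous batch
        delays.append(start - arrival)         # delay grows whenever
        finish_time = start + processing_time  # processing_time > batch_interval
    return delays

# With a 1s interval and 1.5s processing time, the delay grows by 0.5s per batch:
print(simulate(1.0, 1.5, 5))  # [0.0, 0.5, 1.0, 1.5, 2.0]
```

If this queuing model is right, the backlog grows without bound, which is why I am asking whether Spark instead discards the unfinished job at some point.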
Thanks for any ideas shared.

--------------------------------
Thanks & best regards!
罗辉 San.Luo
