Hi all,

I have read the following in the documentation:

"To maximize throughput, set setBufferTimeout(-1) which will remove the
timeout and buffers will only be flushed when they are full. To minimize
latency, set the timeout to a value close to 0 (for example 5 or 10 ms). A
buffer timeout of 0 should be avoided, because it can cause severe
performance degradation."


Why does a buffer timeout of 0 cause severe performance degradation? Shouldn't it give the minimum latency? What exactly is meant by performance degradation there? On the other hand, can we say that the minimum achievable latency is always greater than the buffer timeout?
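For reference, this is roughly how I am setting it (just a minimal sketch; the pipeline itself is only illustrative, not my actual job):

import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class BufferTimeoutExample {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // -1   : flush buffers only when they are full (max throughput)
        //  0   : flush after every record (the case the docs warn about)
        //  5-10: flush at most every few milliseconds (low latency)
        env.setBufferTimeout(5);

        // Illustrative pipeline, just so the setting has something to apply to.
        env.fromElements(1, 2, 3)
           .map(i -> i * 2)
           .print();

        env.execute("buffer-timeout-example");
    }
}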

Best, 


