[ https://issues.apache.org/jira/browse/FLINK-1239?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14209130#comment-14209130 ]

Gábor Hermann commented on FLINK-1239:
--------------------------------------

Hey [~hsaputra], this issue can be reproduced simply by running the 
[IterateExample|https://github.com/apache/incubator-flink/blob/master/flink-addons/flink-streaming/flink-streaming-examples/src/main/java/org/apache/flink/streaming/examples/iteration/IterateExample.java]
 with an input size of 1000 instead of 100, setting the buffer timeout to 0, and 
removing the max wait time setting. The latter sets the maximum wait time for a 
record at the head of the iteration and is used solely for testing purposes (so 
the process does not need to be killed manually). Without it, the iteration runs 
indefinitely, which makes it easy to observe the iteration getting stuck.
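
For reference, a minimal sketch of that setup against the DataStream API (this is 
not the original example code; the input values, function bodies, and the class 
name IterateStuckRepro are illustrative). The three relevant changes are the 
1000-record input, setBufferTimeout(0), and calling iterate() with no max wait time:

import org.apache.flink.api.common.functions.FilterFunction;
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.datastream.IterativeStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class IterateStuckRepro {

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Flush the output buffers after every emitted record, as in the report.
        env.setBufferTimeout(0);

        // Larger input: 1000 records instead of the example's 100.
        Long[] seed = new Long[1000];
        for (int i = 0; i < seed.length; i++) {
            seed[i] = (long) (i + 1);
        }
        DataStream<Long> input = env.fromElements(seed);

        // iterate() without a max wait time: the iteration head waits for
        // feedback indefinitely, so the job never shuts down on its own and
        // a stuck iteration can be observed directly.
        IterativeStream<Long> iteration = input.iterate();

        DataStream<Long> decremented = iteration.map(new MapFunction<Long, Long>() {
            @Override
            public Long map(Long value) {
                return value - 1;
            }
        });

        // Records that still have work left are fed back into the loop.
        iteration.closeWith(decremented.filter(new FilterFunction<Long>() {
            @Override
            public boolean filter(Long value) {
                return value > 0;
            }
        }));

        // Finished records leave the loop.
        decremented.filter(new FilterFunction<Long>() {
            @Override
            public boolean filter(Long value) {
                return value <= 0;
            }
        }).print();

        env.execute("FLINK-1239 reproduction sketch");
    }
}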

> Fix iteration example getting stuck with large input
> ----------------------------------------------------
>
>                 Key: FLINK-1239
>                 URL: https://issues.apache.org/jira/browse/FLINK-1239
>             Project: Flink
>          Issue Type: Bug
>          Components: Streaming
>            Reporter: Gábor Hermann
>            Assignee: Gábor Hermann
>            Priority: Critical
>
> When running the streaming iteration example with the buffer timeout set to 0 
> (meaning the StreamRecordWriter gets flushed after every emit in every task), 
> the iteration gets stuck at flushing the output after emitting a record. This 
> happens only with a larger number of inputs (e.g. 1000 records to iterate on).



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
