[ https://issues.apache.org/jira/browse/SPARK-20251?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15962313#comment-15962313 ]
Nan Zhu commented on SPARK-20251:
---------------------------------

Why is this an invalid report? I have been observing the same behavior since upgrading to Spark 2.1.

The basic idea (on my side) is that an exception thrown from the DStream.compute() method should shut down the application instead of proceeding, since the error handling in Spark Streaming releases the await lock set in awaitTermination().

I am still looking at the threads within Spark Streaming to see what is happening. Can we change this back to a valid case and give me more time to investigate?

> Spark streaming skips batches in a case of failure
> --------------------------------------------------
>
>                 Key: SPARK-20251
>                 URL: https://issues.apache.org/jira/browse/SPARK-20251
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.1.0
>            Reporter: Roman Studenikin
>
> We are experiencing strange behaviour in our Spark Streaming application.
> Sometimes, when a job fails, it simply skips the batch and starts working on
> the next one.
> We expect it to attempt to reprocess the batch, not to skip it. Is this a bug,
> or are we missing important configuration parameters?
> Screenshots from the Spark UI:
> http://pasteboard.co/1oRW0GDUX.png
> http://pasteboard.co/1oSjdFpbc.png

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
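
The mechanism the comment describes -- a batch-processing loop whose failure should release the lock held by awaitTermination() so the driver shuts down rather than moving on to the next batch -- can be sketched with a minimal, hypothetical model. This is NOT Spark's actual implementation; the class and method names below (MiniStreamingContext, await_termination, failing_batch) are illustrative inventions that only mirror the shape of StreamingContext.start()/awaitTermination():

```python
import threading

class MiniStreamingContext:
    """Hypothetical minimal model of the awaitTermination() pattern
    discussed in the comment -- not Spark's real StreamingContext."""

    def __init__(self, batch_fn, batches):
        self._batch_fn = batch_fn
        self._batches = batches
        self._stopped = threading.Event()  # plays the role of the "await lock"
        self.error = None

    def _run(self):
        try:
            for batch in self._batches:
                self._batch_fn(batch)  # stands in for DStream.compute()
        except Exception as exc:
            # On failure, record the error and fall through to releasing
            # the await lock, so the driver stops instead of skipping the
            # failed batch and proceeding to the next one.
            self.error = exc
        finally:
            self._stopped.set()  # release awaitTermination()

    def start(self):
        threading.Thread(target=self._run, daemon=True).start()

    def await_termination(self):
        self._stopped.wait()
        if self.error is not None:
            raise self.error

def failing_batch(batch):
    if batch == 2:
        raise RuntimeError("batch 2 failed")

ctx = MiniStreamingContext(failing_batch, [1, 2, 3])
ctx.start()
try:
    ctx.await_termination()
except RuntimeError as e:
    print("app stopped on:", e)  # terminates instead of skipping batch 2
```

Under this model, the behavior reported in the issue (skipping the failed batch and continuing) would correspond to the exception being swallowed inside the loop instead of reaching the `finally`/re-raise path.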