[ 
https://issues.apache.org/jira/browse/SPARK-16417?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15376504#comment-15376504
 ] 

BinXu commented on SPARK-16417:
-------------------------------

I may be facing the same problem.
[~ren xing], did you override the block generation function?

> spark 1.5.2 receiver store(single-record) with write-ahead log enabled makes 
> the executor crash when BlockGenerator hits an exception while storing a block
> ------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16417
>                 URL: https://issues.apache.org/jira/browse/SPARK-16417
>             Project: Spark
>          Issue Type: Bug
>          Components: Streaming
>    Affects Versions: 1.5.2
>         Environment: spark streaming version 1.5.2.
>            Reporter: ren xing
>
> The receiver's store(single-record) function actually puts the record into a 
> buffer. A background thread periodically drains this buffer, assembles the 
> records into a block, and stores the block in Spark. With the write-ahead log 
> enabled, writing the log sometimes throws an exception. This exception is 
> caught by the background thread, which merely prints a message AND EXITS! 
> From then on there is no consumer for the receiver's internal buffered 
> records, so as time goes on the executor will hit an OOM.
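The failure mode described above can be sketched as a plain producer/consumer pair, independent of Spark. This is a hypothetical, simplified model (not actual Spark or BlockGenerator code): the consumer thread catches its first exception, logs it, and exits, while the producer side (standing in for store()) keeps succeeding, so the buffer is never drained again.

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// Simplified model of the reported bug: a single consumer thread dies on
// its first exception, leaving the producer's buffer with no consumer.
public class DeadConsumerSketch {
    static final BlockingQueue<String> buffer = new ArrayBlockingQueue<>(1000);

    public static void main(String[] args) throws Exception {
        // The receiver side: store(single-record) keeps buffering records.
        for (int i = 0; i < 10; i++) {
            buffer.put("record-" + i);
        }

        Thread consumer = new Thread(() -> {
            try {
                while (true) {
                    buffer.take();
                    // Stand-in for the write-ahead-log write that fails:
                    throw new RuntimeException("WAL write failed");
                }
            } catch (Exception e) {
                // As described in the report: the thread only logs the
                // exception and exits, instead of failing the receiver
                // or retrying. Nothing drains the buffer after this.
                System.err.println("consumer thread exiting: " + e.getMessage());
            }
        });
        consumer.start();
        consumer.join();

        // One record was taken before the exception; the rest stay buffered
        // forever. In a real receiver the buffer keeps growing toward OOM.
        System.out.println("records left unconsumed: " + buffer.size());
    }
}
```

In real code the usual fixes are to propagate the error (e.g. restart or stop the receiver) or to install an uncaught-exception handler, rather than letting the draining thread die silently.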



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
