[ https://issues.apache.org/jira/browse/SPARK-28594?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17127788#comment-17127788 ]

Jungtaek Lim commented on SPARK-28594:
--------------------------------------

Unfortunately, that is probably the only guaranteed way if you're running into 
issues with the event log in a streaming application. There are other, trickier 
alternatives as well, such as periodically stopping the application, removing or 
moving the event log, and restarting the application, but I agree it feels 
awkward to take a bit of downtime just because of the event log.
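
For reference, the fix for this ticket (Fix For: 3.0.0) removes the need for that 
workaround. Below is a minimal Scala sketch, assuming Spark 3.0+ and the rolling 
event log configuration keys added by this work; the application name, log 
directory, and size value are illustrative only:

    import org.apache.spark.sql.SparkSession

    // Enable rolling event logs so a long-running streaming application does not
    // accumulate a single ever-growing event log file.
    val spark = SparkSession.builder()
      .appName("streaming-app-with-rolling-event-logs")          // illustrative name
      .config("spark.eventLog.enabled", "true")
      .config("spark.eventLog.dir", "hdfs:///spark-event-logs")  // illustrative path
      .config("spark.eventLog.rolling.enabled", "true")          // roll instead of one growing file
      .config("spark.eventLog.rolling.maxFileSize", "128m")      // roll over once a file reaches this size
      .getOrCreate()

The history server side also gained a related retention/compaction setting 
(spark.history.fs.eventLog.rolling.maxFilesToRetain), though the exact semantics 
are best checked against the 3.0 docs.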

> Allow event logs for running streaming apps to be rolled over
> -------------------------------------------------------------
>
>                 Key: SPARK-28594
>                 URL: https://issues.apache.org/jira/browse/SPARK-28594
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.0.0
>            Reporter: Stephen Levett
>            Assignee: Jungtaek Lim
>            Priority: Major
>              Labels: releasenotes
>             Fix For: 3.0.0
>
>
> In all current Spark releases, when event logging is enabled for a Spark 
> Streaming application, the event logs grow massively. The files continue to 
> grow until the application is stopped or killed.
> The Spark History Server then has difficulty processing the files.
> https://issues.apache.org/jira/browse/SPARK-8617 addresses .inprogress files, 
> but not the event logs of applications that are still running.
> Could we identify a mechanism to set a "max file" size so that the file is 
> rolled over when it reaches this size?
>  
>  



