[ https://issues.apache.org/jira/browse/SPARK-7689?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15075579#comment-15075579 ]

Apache Spark commented on SPARK-7689:
-------------------------------------

User 'JoshRosen' has created a pull request for this issue:
https://github.com/apache/spark/pull/10534

> Remove TTL-based metadata cleaning (spark.cleaner.ttl)
> ------------------------------------------------------
>
>                 Key: SPARK-7689
>                 URL: https://issues.apache.org/jira/browse/SPARK-7689
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Josh Rosen
>            Assignee: Josh Rosen
>
> With the introduction of ContextCleaner (sketched below), I think there's 
> no longer any reason for most users to enable MetadataCleaner / 
> {{spark.cleaner.ttl}}, except perhaps for very long-lived Spark REPLs where 
> orphaned RDDs or broadcast variables in the REPL history would otherwise 
> never be cleaned up, and even that is an uncommon use case.  This property 
> used to be relevant for Spark Streaming jobs, but that no longer appears to 
> be the case: the latest Streaming docs have removed all mentions of 
> {{spark.cleaner.ttl}} (see 
> https://github.com/apache/spark/pull/4956/files#diff-dbee746abf610b52d8a7cb65bf9ea765L1817,
> for example).
> See 
> http://apache-spark-user-list.1001560.n3.nabble.com/is-spark-cleaner-ttl-safe-td2557.html
>  for an old, related discussion.  Also, see 
> https://github.com/apache/spark/pull/126, the PR that introduced the new 
> ContextCleaner mechanism.
> For Spark 2.0, I think we should remove {{spark.cleaner.ttl}} and the 
> associated TTL-based metadata cleaning code.
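
For context, a minimal sketch contrasting the two cleanup approaches, assuming a pre-2.0 Spark build where {{spark.cleaner.ttl}} still exists. The property name and the ContextCleaner weak-reference behavior are as described above; the driver code, object name, and app names are purely illustrative:

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}

object CleanerSketch {
  def main(args: Array[String]): Unit = {
    // Old, TTL-based approach (the code this issue proposes to remove):
    // MetadataCleaner periodically drops metadata older than the TTL,
    // whether or not the corresponding RDD is still in use, which can
    // break long-lived jobs that revisit old RDDs. Shown for contrast
    // only; this conf is not used below.
    val ttlConf = new SparkConf()
      .setAppName("ttl-cleaner-demo")   // illustrative app name
      .set("spark.cleaner.ttl", "3600") // TTL in seconds

    // Current approach: no TTL needed. ContextCleaner tracks RDDs,
    // shuffles, and broadcast variables with weak references and cleans
    // them up asynchronously once they become unreachable on the driver.
    val sc = new SparkContext(
      new SparkConf().setAppName("context-cleaner-demo").setMaster("local[*]"))

    var rdd = sc.parallelize(1 to 1000).cache()
    rdd.count()

    // Dropping the last reference makes the cached blocks eligible for
    // asynchronous cleanup by ContextCleaner after the next driver GC.
    rdd = null
    System.gc()

    sc.stop()
  }
}
{code}

The key design difference: ContextCleaner ties cleanup to reachability rather than wall-clock time, so nothing is dropped while a live reference to it remains, which is exactly why a user-tuned TTL becomes unnecessary.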


