[ https://issues.apache.org/jira/browse/SPARK-7689?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Josh Rosen updated SPARK-7689:
------------------------------
    Assignee:     (was: Josh Rosen)

I don't have time to work on this now, so it would be great if someone else 
wanted to take over and implement the internal System.gc() timer / call outlined 
at https://github.com/apache/spark/pull/6220#issuecomment-103627537.
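The linked PR comment proposes driving cleanup with a periodic GC so that the weak references ContextCleaner tracks actually get enqueued even on an otherwise idle driver. A minimal sketch of that idea, assuming a plain scheduled daemon thread; the class name, interval, and wiring here are illustrative, not Spark's actual implementation:

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Illustrative sketch (not Spark code): periodically trigger a GC so that
// weakly-referenced RDD/broadcast metadata becomes eligible for cleanup
// even when the driver allocates too little to trigger GC on its own.
class PeriodicGCDriver {
    private final ScheduledExecutorService scheduler =
        Executors.newSingleThreadScheduledExecutor(runnable -> {
            Thread t = new Thread(runnable, "periodic-gc");
            t.setDaemon(true); // don't keep the JVM alive just for GC ticks
            return t;
        });

    /** Schedule System.gc() every intervalSeconds, starting after one interval. */
    public void start(long intervalSeconds) {
        scheduler.scheduleWithFixedDelay(
            System::gc, intervalSeconds, intervalSeconds, TimeUnit.SECONDS);
    }

    public void stop() {
        scheduler.shutdownNow();
    }

    public boolean isShutdown() {
        return scheduler.isShutdown();
    }
}
```

Unlike a TTL, this never deletes metadata that is still reachable; it only nudges the JVM so unreachable entries are reclaimed promptly.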

> Deprecate spark.cleaner.ttl
> ---------------------------
>
>                 Key: SPARK-7689
>                 URL: https://issues.apache.org/jira/browse/SPARK-7689
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Josh Rosen
>
> With the introduction of ContextCleaner, I think there's no longer any reason 
> for most users to enable the MetadataCleaner / {{spark.cleaner.ttl}} (except 
> perhaps for super-long-lived Spark REPLs, where you might worry about orphaning 
> RDDs or broadcast variables in your REPL history and never having them cleaned 
> up, although that seems like an uncommon use case). This property used to be 
> relevant for Spark Streaming jobs, but that no longer appears to be the case: 
> the latest Streaming docs have removed all mentions of {{spark.cleaner.ttl}} 
> (see 
> https://github.com/apache/spark/pull/4956/files#diff-dbee746abf610b52d8a7cb65bf9ea765L1817
>  for example).
> See 
> http://apache-spark-user-list.1001560.n3.nabble.com/is-spark-cleaner-ttl-safe-td2557.html
>  for an old, related discussion.  Also, see 
> https://github.com/apache/spark/pull/126, the PR that introduced the new 
> ContextCleaner mechanism.
> We should probably add a deprecation warning to {{spark.cleaner.ttl}} that 
> advises users against using it, since it's an unsafe configuration option 
> that can lead to confusing behavior if it's misused.
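The deprecation warning proposed above could be a simple check at configuration time. A hypothetical sketch only: Spark's real config handling lives in SparkConf, and the helper name and message wording here are made up for illustration:

```java
import java.util.Map;
import java.util.logging.Logger;

// Hypothetical helper (not Spark's actual API): warn when the deprecated,
// unsafe spark.cleaner.ttl setting is present in a configuration map.
class CleanerTtlDeprecation {
    private static final Logger log = Logger.getLogger("SparkConf");

    /** Returns the warning message if spark.cleaner.ttl is set, else null. */
    public static String checkDeprecated(Map<String, String> conf) {
        if (conf.containsKey("spark.cleaner.ttl")) {
            String msg = "spark.cleaner.ttl is deprecated: it can delete "
                + "metadata for RDDs and broadcast variables that are still "
                + "in use, leading to confusing failures. Rely on the "
                + "automatic ContextCleaner instead.";
            log.warning(msg);
            return msg;
        }
        return null;
    }
}
```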



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
