GitHub user andrewor14 commented on the pull request:

    https://github.com/apache/spark/pull/126#issuecomment-37484584
  
    This is not specifically related to your patch, but I think we can remove 
the MetadataCleaner in SparkContext that periodically cleans up persisted RDDs. 
In particular, if a user has explicitly persisted an RDD, periodic cleanup may 
evict an RDD that is in fact still in use. It seems to me that it is the 
user's responsibility to clean up whatever they persist in the first place.
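
    For illustration, here is a minimal sketch of the explicit lifecycle this 
argues for, using the standard `persist`/`unpersist` RDD API (the local-mode 
SparkContext setup is just for the example):

    ```scala
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.storage.StorageLevel

    val sc = new SparkContext(
      new SparkConf().setAppName("persist-demo").setMaster("local[*]"))

    // The user explicitly persists an RDD they intend to reuse...
    val data = sc.parallelize(1 to 1000000).persist(StorageLevel.MEMORY_ONLY)

    // ...runs several actions against the cached data...
    val total = data.sum()
    val evens = data.filter(_ % 2 == 0).count()

    // ...and explicitly releases it when done, rather than relying on a
    // periodic cleaner to drop it out from under a job that still needs it.
    data.unpersist()
    sc.stop()
    ```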
    
    (That said, doing it through finalize() is fine: by the time finalize() 
runs, the RDD is unreachable, so the user has no way of still using it.)
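
    A hedged sketch of the finalize()-style cleanup the parenthetical alludes 
to is below; `CachedHandle` is a hypothetical wrapper, not an existing Spark 
class:

    ```scala
    import org.apache.spark.rdd.RDD

    // Hypothetical illustration only: a handle that unpersists its RDD when
    // it is garbage collected. By the time finalize() runs, the handle is
    // unreachable, so no user code can still be relying on the cached blocks.
    class CachedHandle[T](val rdd: RDD[T]) {
      rdd.persist()

      override protected def finalize(): Unit = {
        try rdd.unpersist(blocking = false)
        finally super.finalize()
      }
    }
    ```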

