Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/12327#discussion_r59605580
  
    --- Diff: core/src/main/scala/org/apache/spark/util/ClosureCleaner.scala ---
    @@ -233,16 +217,22 @@ private[spark] object ClosureCleaner extends Logging {
         // Note that all outer objects but the outermost one (first one in this list) must be closures
         var outerPairs: List[(Class[_], AnyRef)] = (outerClasses zip outerObjects).reverse
         var parent: AnyRef = null
    -    if (outerPairs.size > 0 && !isClosure(outerPairs.head._1)) {
    -      // The closure is ultimately nested inside a class; keep the object of that
    -      // class without cloning it since we don't want to clone the user's objects.
    -      // Note that we still need to keep around the outermost object itself because
    -      // we need it to clone its child closure later (see below).
    -      logDebug(s" + outermost object is not a closure, so do not clone it: ${outerPairs.head}")
    -      parent = outerPairs.head._2 // e.g. SparkContext
    -      outerPairs = outerPairs.tail
    -    } else if (outerPairs.size > 0) {
    -      logDebug(s" + outermost object is a closure, so we just keep it: ${outerPairs.head}")
    +    if (outerPairs.size > 0) {
    +      if (isClosure(outerPairs.head._1)) {
    +        logDebug(s" + outermost object is a closure, so we just keep it: ${outerPairs.head}")
    +      } else if (outerPairs.head._1.getName.startsWith("$line")) {
    --- End diff ---
    
    can you add a short comment in this `if` case describing why we treat REPL classes as a special case (and include `SPARK-14558`)?
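    
    For example, something along these lines (just a sketch, not required wording: the rationale is paraphrased from SPARK-14558, which deals with the generated `$lineXX` wrapper classes the Scala REPL emits for each input line, and the `logDebug` message below is only illustrative):
    
    ```scala
          } else if (outerPairs.head._1.getName.startsWith("$line")) {
            // SPARK-14558: the Scala REPL wraps each input line in a generated
            // `$lineXX` class. Unlike a user class, such an object may carry a
            // lot of unrelated state captured from earlier REPL lines, so we
            // leave it in `outerPairs` to be cloned and cleaned like a closure
            // instead of keeping the original object around uncloned.
            logDebug(s" + outermost object is a REPL line object, so we clone it: ${outerPairs.head}")
          }
    ```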

