GitHub user zsxwing commented on the pull request:

    https://github.com/apache/spark/pull/5277#issuecomment-87969443
  
    Before your change, the Java interface of `SparkContext$` contains:
    ```
    public final class org.apache.spark.SparkContext$ extends java.lang.Object implements org.apache.spark.Logging{
        public java.lang.Object org$apache$spark$SparkContext$$SPARK_CONTEXT_CONSTRUCTOR_LOCK();
    }
    ```
    But after your change, `SPARK_CONTEXT_CONSTRUCTOR_LOCK` is no longer used outside of `SparkContext$`, so the compiler does not generate
    ```
    public java.lang.Object org$apache$spark$SparkContext$$SPARK_CONTEXT_CONSTRUCTOR_LOCK();
    ```
    
    I think it's OK since this is not a Scala public API.
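
    To illustrate why the accessor appears and then goes away, here is a minimal sketch (hypothetical `Context`/`LOCK` names, not the actual Spark sources):
    ```
    // A sketch of the accessor-generation behaviour (hypothetical names).
    class Context {
      // Reading the companion object's private val from the class forces scalac
      // to emit a public, name-mangled accessor (something like `Context$$LOCK()`)
      // on `Context$`, because `Context` and `Context$` are separate JVM classes.
      Context.LOCK.synchronized {
        // ... constructor-time bookkeeping would go here ...
      }
    }

    object Context {
      private val LOCK = new Object()

      // If the lock is only ever used inside the object itself, as in this
      // method, no public accessor is needed and javap no longer lists it.
      def withLock[T](body: => T): T = LOCK.synchronized(body)
    }
    ```
    Compiling that and running `javap` on `Context$` should show the mangled accessor; once the class-side access is dropped so only the object touches the lock, the accessor disappears, which matches what is described above.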

