Github user dhruve commented on a diff in the pull request:

    https://github.com/apache/spark/pull/13061#discussion_r63410648
  
    --- Diff: core/src/main/scala/org/apache/spark/package.scala ---
    @@ -41,7 +41,50 @@ package org.apache
      * level interfaces. These are subject to changes or removal in minor 
releases.
      */
     
    +import java.util.Properties
    +
    +import org.apache.spark.internal.Logging
    +
     package object spark {
    -  // For package docs only
    -  val SPARK_VERSION = "2.0.0-SNAPSHOT"
    +
    +  object SparkBuildInfo extends Logging {
    --- End diff --
    
    - Yes, the object is there for a reason: it's because of how Scala internally uses pattern matching.
    
    - You are right that Scala doesn't have static initialization blocks like Java. The nicer/preferred way to construct and initialize the constants would be the one you showed, which is what I tried first while loading the values. However, the Scala compiler treats the left-hand side as a pattern, and it interprets all-caps names in a pattern as references to existing constants rather than as variables to bind. This is not obvious from the error message it throws.
    
    - You can try it in the Scala REPL:
    ```scala
    // lowercase names are bound as new variables, so this works fine
    val (one, two) = ("one", "two")

    // all-caps names in a pattern are treated as constants to match
    // against, so this fails to compile (the compiler looks for
    // existing values named THREE and FOUR)
    val (THREE, FOUR) = ("three", "four")
    ```
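    For what it's worth, two workarounds are possible: define each all-caps constant individually (no pattern, so no matching), or destructure into lowercase names first and alias them. A minimal sketch (`UppercaseBinding` is just an illustrative name, not from the diff):
    ```scala
    object UppercaseBinding {
      // Plain definitions involve no pattern matching, so all-caps names compile.
      val THREE = "three"
      val FOUR  = "four"

      // Destructuring with lowercase names, then aliasing, also works:
      private val (fivePart, sixPart) = ("five", "six")
      val FIVE = fivePart
      val SIX  = sixPart
    }
    ```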
    
    - Making the object private makes sense. I will change that.

