[GitHub] spark pull request #13061: [SPARK-14279] [Build] Pick the spark version from...
Github user asfgit closed the pull request at: https://github.com/apache/spark/pull/13061

---
If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does not have this feature enabled and wishes so, or if the feature is enabled but not working, please contact infrastructure at infrastruct...@apache.org or file a JIRA ticket with INFRA.

---
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
Github user vanzin commented on a diff in the pull request: https://github.com/apache/spark/pull/13061#discussion_r65750838

Diff: core/src/main/scala/org/apache/spark/package.scala

```diff
@@ -41,7 +41,53 @@ package org.apache
  * level interfaces. These are subject to changes or removal in minor releases.
  */
+import java.util.Properties
+
 package object spark {
-  // For package docs only
-  val SPARK_VERSION = "2.0.0-SNAPSHOT"
+
+  private object SparkBuildInfo {
+    val resourceStream = Thread.currentThread().getContextClassLoader.
```

You can still declare it before the try. Just create a block:

```scala
val (...) = {
  val stream = ...
  try {
    ...
```
Github user dhruve commented on a diff in the pull request: https://github.com/apache/spark/pull/13061#discussion_r65750639

Diff: core/src/main/scala/org/apache/spark/package.scala

```diff
@@ -41,7 +41,53 @@ package org.apache
  * level interfaces. These are subject to changes or removal in minor releases.
  */
+import java.util.Properties
+
 package object spark {
-  // For package docs only
-  val SPARK_VERSION = "2.0.0-SNAPSHOT"
+
+  private object SparkBuildInfo {
+    val resourceStream = Thread.currentThread().getContextClassLoader.
```

I wanted to avoid using a var, so I defined it outside the block; otherwise it wouldn't be accessible in the finally.
Github user vanzin commented on a diff in the pull request: https://github.com/apache/spark/pull/13061#discussion_r65748365

Diff: core/src/main/scala/org/apache/spark/package.scala

```diff
@@ -41,7 +41,53 @@ package org.apache
  * level interfaces. These are subject to changes or removal in minor releases.
  */
+import java.util.Properties
+
 package object spark {
-  // For package docs only
-  val SPARK_VERSION = "2.0.0-SNAPSHOT"
+
+  private object SparkBuildInfo {
+    val resourceStream = Thread.currentThread().getContextClassLoader.
```

Might also be a good idea to explicitly check that `resourceStream` is not null (instead of throwing an NPE). Shouldn't ever happen, but well.
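Putting the review suggestions together — the stream opened inside a block so it stays local and garbage-collectible, an explicit null check instead of a later NPE, and try/finally for cleanup without a var — a minimal sketch could look like this. The helper name `loadBuildInfo`, the error message, and the property keys are illustrative assumptions, not code from the PR:

```scala
import java.util.Properties

// Hypothetical helper sketching the pattern discussed in the review.
def loadBuildInfo(resourceName: String): (String, String) = {
  val (version, branch) = {
    // The stream is scoped to this block, so it can be garbage collected
    // once the values are extracted, and no var is needed for the finally.
    val stream = Thread.currentThread().getContextClassLoader
      .getResourceAsStream(resourceName)
    if (stream == null) {
      // Explicit check instead of letting props.load throw an NPE.
      throw new IllegalStateException(s"$resourceName not found on classpath")
    }
    try {
      val props = new Properties()
      props.load(stream)
      (props.getProperty("version", "<unknown>"),
        props.getProperty("branch", "<unknown>"))
    } finally {
      stream.close()
    }
  }
  (version, branch)
}
```

Because the tuple is assigned from the block's result, the outer vals stay immutable while the stream never escapes the block.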
Github user vanzin commented on a diff in the pull request: https://github.com/apache/spark/pull/13061#discussion_r65747950

Diff: core/src/main/scala/org/apache/spark/package.scala

```diff
@@ -41,7 +41,53 @@ package org.apache
  * level interfaces. These are subject to changes or removal in minor releases.
  */
+import java.util.Properties
+
 package object spark {
-  // For package docs only
-  val SPARK_VERSION = "2.0.0-SNAPSHOT"
+
+  private object SparkBuildInfo {
+    val resourceStream = Thread.currentThread().getContextClassLoader.
```

Ah, missed this before; this should also be inside the block so that it can be garbage collected.
Github user vanzin commented on a diff in the pull request: https://github.com/apache/spark/pull/13061#discussion_r65747983

Diff: core/src/main/scala/org/apache/spark/package.scala

```diff
@@ -41,7 +41,53 @@ package org.apache
  * level interfaces. These are subject to changes or removal in minor releases.
  */
+import java.util.Properties
+
 package object spark {
-  // For package docs only
-  val SPARK_VERSION = "2.0.0-SNAPSHOT"
+
+  private object SparkBuildInfo {
+    val resourceStream = Thread.currentThread().getContextClassLoader.
+      getResourceAsStream("spark-version-info.properties")
+
+    val (
+      spark_version: String,
+      spark_branch: String,
```

These arguments need to be indented one extra level.
Github user dhruve commented on a diff in the pull request: https://github.com/apache/spark/pull/13061#discussion_r65643828

Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala

```diff
@@ -103,6 +104,9 @@ object SparkSubmit {
    /___/ .__/\_,_/_/ /_/\_\   version %s
       /_/
                         """.format(SPARK_VERSION))
+    printStream.println("Branch %s".format(SPARK_BRANCH))
+    printStream.println("Compiled by user %s on %s".format(SPARK_BUILD_USER, SPARK_BUILD_DATE))
+    printStream.println("Url %s".format(SPARK_REPO_URL))
```

I had checked on this earlier. Currently the information is displayed in multiple places: SparkSubmit, the REPLs for Scala 2.10 and 2.11, and the Python and R shells. I personally would prefer to keep this consistent. However, I see the welcome message being duplicated in 3 places in the Scala code itself, with some minor differences in the messages shown. Refactoring them into a single location is something that we should do in the ideal case. Since the focus was to correctly identify the Spark versions running on "real" clusters, we limited the change to SparkSubmit, as users aren't allowed to fire up Spark shells on the cluster. @jodersky, @vanzin, let me know if you want to have it consistent across all the messages. Or we could just accept this one now and file another JIRA for the refactoring.
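The consolidation dhruve describes — one banner helper shared by SparkSubmit and the shells instead of three duplicated copies — could be sketched as below. The object and method names and the placeholder build values are illustrative assumptions, not code from the PR:

```scala
import java.io.PrintStream

object BuildInfoBanner {
  // Placeholder values standing in for the constants the PR exposes
  // from the spark package object.
  val SPARK_VERSION = "2.0.0-SNAPSHOT"
  val SPARK_BRANCH = "master"
  val SPARK_BUILD_USER = "jenkins"
  val SPARK_BUILD_DATE = "2016-06-01"
  val SPARK_REPO_URL = "https://github.com/apache/spark"

  // Single helper so SparkSubmit and the REPLs could share one banner,
  // rather than duplicating it with minor differences in each entry point.
  def printBuildInfo(out: PrintStream): Unit = {
    out.println("version %s".format(SPARK_VERSION))
    out.println("Branch %s".format(SPARK_BRANCH))
    out.println("Compiled by user %s on %s".format(SPARK_BUILD_USER, SPARK_BUILD_DATE))
    out.println("Url %s".format(SPARK_REPO_URL))
  }
}
```

Each entry point would then pass its own output stream (SparkSubmit's `printStream`, the REPL's console writer) to the shared helper.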
Github user dhruve commented on a diff in the pull request: https://github.com/apache/spark/pull/13061#discussion_r65643378

Diff: core/src/main/scala/org/apache/spark/package.scala

```diff
@@ -41,7 +41,53 @@ package org.apache
  * level interfaces. These are subject to changes or removal in minor releases.
  */
+import java.util.Properties
+
 package object spark {
-  // For package docs only
-  val SPARK_VERSION = "2.0.0-SNAPSHOT"
+
+  private object SparkBuildInfo {
+    val resourceStream = Thread.currentThread().getContextClassLoader.
+      getResourceAsStream("spark-version-info.properties")
+
+    val (
+      spark_version: String,
```

We are using all caps because these are constants, and the underscores improve the readability. More or less I have tried to be consistent with how SPARK_VERSION is named and used throughout.
Github user jodersky commented on a diff in the pull request: https://github.com/apache/spark/pull/13061#discussion_r65465157

Diff: core/src/main/scala/org/apache/spark/package.scala

```diff
@@ -41,7 +41,53 @@ package org.apache
  * level interfaces. These are subject to changes or removal in minor releases.
  */
+import java.util.Properties
+
 package object spark {
-  // For package docs only
-  val SPARK_VERSION = "2.0.0-SNAPSHOT"
+
+  private object SparkBuildInfo {
+    val resourceStream = Thread.currentThread().getContextClassLoader.
+      getResourceAsStream("spark-version-info.properties")
+
+    val (
+      spark_version: String,
```

Oh, this is actually local, so no big deal. If you get a chance to fix it during conflict resolution, it would be great though.
Github user jodersky commented on a diff in the pull request: https://github.com/apache/spark/pull/13061#discussion_r65464868

Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala

```diff
@@ -103,6 +104,9 @@ object SparkSubmit {
    /___/ .__/\_,_/_/ /_/\_\   version %s
       /_/
                         """.format(SPARK_VERSION))
+    printStream.println("Branch %s".format(SPARK_BRANCH))
+    printStream.println("Compiled by user %s on %s".format(SPARK_BUILD_USER, SPARK_BUILD_DATE))
+    printStream.println("Url %s".format(SPARK_REPO_URL))
```

It would be great to also add the build info to org.apache.spark.repl.SparkILoop.
Github user jodersky commented on a diff in the pull request: https://github.com/apache/spark/pull/13061#discussion_r65464743

Diff: core/src/main/scala/org/apache/spark/package.scala

```diff
@@ -41,7 +41,53 @@ package org.apache
  * level interfaces. These are subject to changes or removal in minor releases.
  */
+import java.util.Properties
+
 package object spark {
-  // For package docs only
-  val SPARK_VERSION = "2.0.0-SNAPSHOT"
+
+  private object SparkBuildInfo {
+    val resourceStream = Thread.currentThread().getContextClassLoader.
+      getResourceAsStream("spark-version-info.properties")
+
+    val (
+      spark_version: String,
```

Is this purposefully using underscores? `sparkVersion` is consistent with the general style.