[GitHub] [spark] AmplabJenkins commented on issue #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
AmplabJenkins commented on issue #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core
URL: https://github.com/apache/spark/pull/27913#issuecomment-599064014


   Test PASSed.
   Refer to this link for build results (access rights to CI server needed):
   https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/119797/


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] SparkQA removed a comment on issue #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
SparkQA removed a comment on issue #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core
URL: https://github.com/apache/spark/pull/27913#issuecomment-599043553


   **[Test build #119797 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/119797/testReport)** for PR 27913 at commit [`ad58e0c`](https://github.com/apache/spark/commit/ad58e0cc8934f19eefc441f7977b840dfe850c0d).





[GitHub] [spark] AmplabJenkins commented on issue #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
AmplabJenkins commented on issue #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core
URL: https://github.com/apache/spark/pull/27913#issuecomment-599064009
 
 
   Merged build finished. Test PASSed.





[GitHub] [spark] SparkQA commented on issue #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
SparkQA commented on issue #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core
URL: https://github.com/apache/spark/pull/27913#issuecomment-599063750


   **[Test build #119797 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/119797/testReport)** for PR 27913 at commit [`ad58e0c`](https://github.com/apache/spark/commit/ad58e0cc8934f19eefc441f7977b840dfe850c0d).
   * This patch passes all tests.
   * This patch merges cleanly.
   * This patch adds no public classes.





[GitHub] [spark] AmplabJenkins removed a comment on issue #27908: [SPARK-31000] Add ability to set table description via Catalog.createTable()

2020-03-14 Thread GitBox
AmplabJenkins removed a comment on issue #27908: [SPARK-31000] Add ability to set table description via Catalog.createTable()
URL: https://github.com/apache/spark/pull/27908#issuecomment-599056843


   Test PASSed.
   Refer to this link for build results (access rights to CI server needed):
   https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/119796/





[GitHub] [spark] AmplabJenkins removed a comment on issue #27908: [SPARK-31000] Add ability to set table description via Catalog.createTable()

2020-03-14 Thread GitBox
AmplabJenkins removed a comment on issue #27908: [SPARK-31000] Add ability to set table description via Catalog.createTable()
URL: https://github.com/apache/spark/pull/27908#issuecomment-599056841
 
 
   Merged build finished. Test PASSed.





[GitHub] [spark] AmplabJenkins commented on issue #27908: [SPARK-31000] Add ability to set table description via Catalog.createTable()

2020-03-14 Thread GitBox
AmplabJenkins commented on issue #27908: [SPARK-31000] Add ability to set table description via Catalog.createTable()
URL: https://github.com/apache/spark/pull/27908#issuecomment-599056841
 
 
   Merged build finished. Test PASSed.





[GitHub] [spark] AmplabJenkins commented on issue #27908: [SPARK-31000] Add ability to set table description via Catalog.createTable()

2020-03-14 Thread GitBox
AmplabJenkins commented on issue #27908: [SPARK-31000] Add ability to set table description via Catalog.createTable()
URL: https://github.com/apache/spark/pull/27908#issuecomment-599056843


   Test PASSed.
   Refer to this link for build results (access rights to CI server needed):
   https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/119796/





[GitHub] [spark] SparkQA removed a comment on issue #27908: [SPARK-31000] Add ability to set table description via Catalog.createTable()

2020-03-14 Thread GitBox
SparkQA removed a comment on issue #27908: [SPARK-31000] Add ability to set table description via Catalog.createTable()
URL: https://github.com/apache/spark/pull/27908#issuecomment-599029994


   **[Test build #119796 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/119796/testReport)** for PR 27908 at commit [`a0ad064`](https://github.com/apache/spark/commit/a0ad064aff1b0406fb5fd1dab8cb5b1c2b4140c2).





[GitHub] [spark] SparkQA commented on issue #27908: [SPARK-31000] Add ability to set table description via Catalog.createTable()

2020-03-14 Thread GitBox
SparkQA commented on issue #27908: [SPARK-31000] Add ability to set table description via Catalog.createTable()
URL: https://github.com/apache/spark/pull/27908#issuecomment-599056600


   **[Test build #119796 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/119796/testReport)** for PR 27908 at commit [`a0ad064`](https://github.com/apache/spark/commit/a0ad064aff1b0406fb5fd1dab8cb5b1c2b4140c2).
   * This patch passes all tests.
   * This patch merges cleanly.
   * This patch adds no public classes.





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579567


 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1282,6 +1356,7 @@ package object config {
         s"shuffle `${SHUFFLE_SERVICE_ENABLED.key}` is enabled), shuffle " +
         "blocks requested from those block managers which are running on the same host are read " +
         "from the disk directly instead of being fetched as remote blocks over the network.")
+      .version("3.0.0")

 Review comment:
   SPARK-30812, commit ID: 68d7edf9497bea2f73707d32ab55dd8e53088e7c#diff-6bdad48cfc34314e89599655442ff210
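The `.version()` lines under review follow the fluent `ConfigBuilder` pattern used throughout `core/src/main/scala/org/apache/spark/internal/config/package.scala`. As a rough, self-contained sketch of how such a builder records the version string (simplified stand-in types, not Spark's actual internal classes):

```scala
// Simplified stand-in for Spark's internal config DSL; the real builder
// lives in org.apache.spark.internal.config and has many more methods.
final case class ConfigEntry[T](key: String, doc: String, version: String, defaultValue: T)

class ConfigBuilder(key: String) {
  private var docText: String = ""
  private var ver: String = ""

  def doc(d: String): ConfigBuilder = { docText = d; this }

  // What PR #27913 back-fills: the Spark release that introduced the config.
  def version(v: String): ConfigBuilder = { ver = v; this }

  def booleanConf: TypedConfigBuilder[Boolean] =
    new TypedConfigBuilder[Boolean](key, docText, ver)
}

class TypedConfigBuilder[T](key: String, doc: String, version: String) {
  def createWithDefault(default: T): ConfigEntry[T] =
    ConfigEntry(key, doc, version, default)
}

object ConfigSketch {
  // One of the entries annotated in this PR, rebuilt with the sketch above.
  val SHUFFLE_SYNC: ConfigEntry[Boolean] =
    new ConfigBuilder("spark.shuffle.sync")
      .doc("Whether to force outstanding writes to disk.")
      .version("0.8.0")
      .booleanConf
      .createWithDefault(false)
}
```

The point of the fluent shape is that `version()` can be inserted between `doc()` and the typed terminator (`booleanConf`, `intConf`, ...) without disturbing existing call chains, which is why the diff hunks in this thread are single-line additions.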





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579523


 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1238,32 +1306,37 @@ package object config {
         "by using extra memory to detect early corruption. Any IOException thrown will cause " +
         "the task to be retried once and if it fails again with same exception, then " +
         "FetchFailedException will be thrown to retry previous stage")
+      .version("3.0.0")
       .booleanConf
       .createWithDefault(false)

   private[spark] val SHUFFLE_SYNC =
     ConfigBuilder("spark.shuffle.sync")
       .doc("Whether to force outstanding writes to disk.")
+      .version("0.8.0")
       .booleanConf
       .createWithDefault(false)

   private[spark] val SHUFFLE_UNSAFE_FAST_MERGE_ENABLE =
     ConfigBuilder("spark.shuffle.unsafe.fastMergeEnabled")
       .doc("Whether to perform a fast spill merge.")
+      .version("1.4.0")
       .booleanConf
       .createWithDefault(true)

   private[spark] val SHUFFLE_SORT_USE_RADIXSORT =
     ConfigBuilder("spark.shuffle.sort.useRadixSort")
       .doc("Whether to use radix sort for sorting in-memory partition ids. Radix sort is much " +
         "faster, but requires additional memory to be reserved memory as pointers are added.")
+      .version("2.0.0")
       .booleanConf
       .createWithDefault(true)

   private[spark] val SHUFFLE_MIN_NUM_PARTS_TO_HIGHLY_COMPRESS =
     ConfigBuilder("spark.shuffle.minNumPartitionsToHighlyCompress")
       .internal()
       .doc("Number of partitions to determine if MapStatus should use HighlyCompressedMapStatus")
+      .version("2.4.0")

 Review comment:
   SPARK-24519, commit ID: 39dfaf2fd167cafc84ec9cc637c114ed54a331e3#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579544


 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1273,6 +1346,7 @@ package object config {
       .doc("Whether to use the old protocol while doing the shuffle block fetching. " +
         "It is only enabled while we need the compatibility in the scenario of new Spark " +
         "version job fetching shuffle blocks from old version external shuffle service.")
+      .version("3.0.0")

 Review comment:
   SPARK-25341, commit ID: f725d472f51fb80c6ce1882ec283ff69bafb0de4#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579504


 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1238,32 +1306,37 @@ package object config {
         "by using extra memory to detect early corruption. Any IOException thrown will cause " +
         "the task to be retried once and if it fails again with same exception, then " +
         "FetchFailedException will be thrown to retry previous stage")
+      .version("3.0.0")
       .booleanConf
       .createWithDefault(false)

   private[spark] val SHUFFLE_SYNC =
     ConfigBuilder("spark.shuffle.sync")
       .doc("Whether to force outstanding writes to disk.")
+      .version("0.8.0")
       .booleanConf
       .createWithDefault(false)

   private[spark] val SHUFFLE_UNSAFE_FAST_MERGE_ENABLE =
     ConfigBuilder("spark.shuffle.unsafe.fastMergeEnabled")
       .doc("Whether to perform a fast spill merge.")
+      .version("1.4.0")
       .booleanConf
       .createWithDefault(true)

   private[spark] val SHUFFLE_SORT_USE_RADIXSORT =
     ConfigBuilder("spark.shuffle.sort.useRadixSort")
       .doc("Whether to use radix sort for sorting in-memory partition ids. Radix sort is much " +
         "faster, but requires additional memory to be reserved memory as pointers are added.")
+      .version("2.0.0")

 Review comment:
   SPARK-14724, commit ID: e2b5647ab92eb478b3f7b36a0ce6faf83e24c0e5#diff-3eedc75de4787b842477138d8cc7f150





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579487


 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1238,32 +1306,37 @@ package object config {
         "by using extra memory to detect early corruption. Any IOException thrown will cause " +
         "the task to be retried once and if it fails again with same exception, then " +
         "FetchFailedException will be thrown to retry previous stage")
+      .version("3.0.0")
       .booleanConf
       .createWithDefault(false)

   private[spark] val SHUFFLE_SYNC =
     ConfigBuilder("spark.shuffle.sync")
       .doc("Whether to force outstanding writes to disk.")
+      .version("0.8.0")
       .booleanConf
       .createWithDefault(false)

   private[spark] val SHUFFLE_UNSAFE_FAST_MERGE_ENABLE =
     ConfigBuilder("spark.shuffle.unsafe.fastMergeEnabled")
       .doc("Whether to perform a fast spill merge.")
+      .version("1.4.0")

 Review comment:
   SPARK-7081, commit ID: c53ebea9db418099df50f9adc1a18cee7849cd97#diff-642ce9f439435408382c3ac3b5c5e0a0





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579418


 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1187,48 +1247,56 @@ package object config {
       .internal()
       .doc("Initial threshold for the size of a collection before we start tracking its " +
         "memory usage.")
+      .version("1.1.1")
       .bytesConf(ByteUnit.BYTE)
       .createWithDefault(5 * 1024 * 1024)

   private[spark] val SHUFFLE_SPILL_BATCH_SIZE =
     ConfigBuilder("spark.shuffle.spill.batchSize")
       .internal()
       .doc("Size of object batches when reading/writing from serializers.")
+      .version("0.9.0")
       .longConf
       .createWithDefault(1)

   private[spark] val SHUFFLE_SORT_BYPASS_MERGE_THRESHOLD =
     ConfigBuilder("spark.shuffle.sort.bypassMergeThreshold")
       .doc("In the sort-based shuffle manager, avoid merge-sorting data if there is no " +
         "map-side aggregation and there are at most this many reduce partitions")
+      .version("1.1.1")
       .intConf
       .createWithDefault(200)

   private[spark] val SHUFFLE_MANAGER =
     ConfigBuilder("spark.shuffle.manager")
+      .version("1.1.0")
       .stringConf
       .createWithDefault("sort")

   private[spark] val SHUFFLE_REDUCE_LOCALITY_ENABLE =
     ConfigBuilder("spark.shuffle.reduceLocality.enabled")
       .doc("Whether to compute locality preferences for reduce tasks")
+      .version("1.5.0")
       .booleanConf
       .createWithDefault(true)

   private[spark] val SHUFFLE_MAPOUTPUT_MIN_SIZE_FOR_BROADCAST =
     ConfigBuilder("spark.shuffle.mapOutput.minSizeForBroadcast")
       .doc("The size at which we use Broadcast to send the map output statuses to the executors.")
+      .version("2.0.0")
       .bytesConf(ByteUnit.BYTE)
       .createWithDefaultString("512k")

   private[spark] val SHUFFLE_MAPOUTPUT_DISPATCHER_NUM_THREADS =
     ConfigBuilder("spark.shuffle.mapOutput.dispatcher.numThreads")
+      .version("2.0.0")
       .intConf
       .createWithDefault(8)

   private[spark] val SHUFFLE_DETECT_CORRUPT =
     ConfigBuilder("spark.shuffle.detectCorrupt")
       .doc("Whether to detect any corruption in fetched blocks.")
+      .version("2.2.0")

 Review comment:
   SPARK-4105, commit ID: cf33a86285629abe72c1acf235b8bfa6057220a8#diff-eb30a71e0d04150b8e0b64929852e38b
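Each review reply in this thread pairs a `.version()` value with the JIRA ticket and commit that introduced the config. A small self-contained sketch (hypothetical `VersionedConf` type, not Spark's API) of the invariant the follow-up is establishing, using versions cited in these comments:

```scala
// Hypothetical minimal registry; Spark's real entries live in
// core/src/main/scala/org/apache/spark/internal/config/package.scala.
final case class VersionedConf(key: String, version: String)

object VersionAudit {
  // A few of the entries annotated in this PR, with the versions cited above.
  val entries: Seq[VersionedConf] = Seq(
    VersionedConf("spark.shuffle.sync", "0.8.0"),               // no JIRA, early commit
    VersionedConf("spark.shuffle.manager", "1.1.0"),            // SPARK-2044
    VersionedConf("spark.shuffle.sort.useRadixSort", "2.0.0"),  // SPARK-14724
    VersionedConf("spark.shuffle.detectCorrupt", "2.2.0"),      // SPARK-4105
    VersionedConf("spark.shuffle.useOldFetchProtocol", "3.0.0") // SPARK-25341
  )

  // The check reviewers apply by hand: every entry names the release
  // that introduced it, in major.minor.patch form.
  def allVersioned: Boolean =
    entries.forall(e => e.version.matches("""\d+\.\d+\.\d+"""))
}
```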





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579437


 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1238,32 +1306,37 @@ package object config {
         "by using extra memory to detect early corruption. Any IOException thrown will cause " +
         "the task to be retried once and if it fails again with same exception, then " +
         "FetchFailedException will be thrown to retry previous stage")
+      .version("3.0.0")

 Review comment:
   SPARK-26089, commit ID: 688b0c01fac0db80f6473181673a89f1ce1be65b#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579465


 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1238,32 +1306,37 @@ package object config {
         "by using extra memory to detect early corruption. Any IOException thrown will cause " +
         "the task to be retried once and if it fails again with same exception, then " +
         "FetchFailedException will be thrown to retry previous stage")
+      .version("3.0.0")
       .booleanConf
       .createWithDefault(false)

   private[spark] val SHUFFLE_SYNC =
     ConfigBuilder("spark.shuffle.sync")
       .doc("Whether to force outstanding writes to disk.")
+      .version("0.8.0")

 Review comment:
   No JIRA ID, commit ID: 31da065b1d08c1fad5283e4bcf8e0ed01818c03e#diff-ad46ed23fcc3fa87f30d05204917b917





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579334


 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1187,48 +1247,56 @@ package object config {
       .internal()
       .doc("Initial threshold for the size of a collection before we start tracking its " +
         "memory usage.")
+      .version("1.1.1")
       .bytesConf(ByteUnit.BYTE)
       .createWithDefault(5 * 1024 * 1024)

   private[spark] val SHUFFLE_SPILL_BATCH_SIZE =
     ConfigBuilder("spark.shuffle.spill.batchSize")
       .internal()
       .doc("Size of object batches when reading/writing from serializers.")
+      .version("0.9.0")
       .longConf
       .createWithDefault(1)

   private[spark] val SHUFFLE_SORT_BYPASS_MERGE_THRESHOLD =
     ConfigBuilder("spark.shuffle.sort.bypassMergeThreshold")
       .doc("In the sort-based shuffle manager, avoid merge-sorting data if there is no " +
         "map-side aggregation and there are at most this many reduce partitions")
+      .version("1.1.1")
       .intConf
       .createWithDefault(200)

   private[spark] val SHUFFLE_MANAGER =
     ConfigBuilder("spark.shuffle.manager")
+      .version("1.1.0")
       .stringConf
       .createWithDefault("sort")

   private[spark] val SHUFFLE_REDUCE_LOCALITY_ENABLE =
     ConfigBuilder("spark.shuffle.reduceLocality.enabled")
       .doc("Whether to compute locality preferences for reduce tasks")
+      .version("1.5.0")

 Review comment:
   SPARK-2774, commit ID: 96a7c888d806adfdb2c722025a1079ed7eaa2052#diff-6a9ff7fb74fd490a50462d45db2d5e11





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579371


 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1187,48 +1247,56 @@ package object config {
       .internal()
       .doc("Initial threshold for the size of a collection before we start tracking its " +
         "memory usage.")
+      .version("1.1.1")
       .bytesConf(ByteUnit.BYTE)
       .createWithDefault(5 * 1024 * 1024)

   private[spark] val SHUFFLE_SPILL_BATCH_SIZE =
     ConfigBuilder("spark.shuffle.spill.batchSize")
       .internal()
       .doc("Size of object batches when reading/writing from serializers.")
+      .version("0.9.0")
       .longConf
       .createWithDefault(1)

   private[spark] val SHUFFLE_SORT_BYPASS_MERGE_THRESHOLD =
     ConfigBuilder("spark.shuffle.sort.bypassMergeThreshold")
       .doc("In the sort-based shuffle manager, avoid merge-sorting data if there is no " +
         "map-side aggregation and there are at most this many reduce partitions")
+      .version("1.1.1")
       .intConf
       .createWithDefault(200)

   private[spark] val SHUFFLE_MANAGER =
     ConfigBuilder("spark.shuffle.manager")
+      .version("1.1.0")
       .stringConf
       .createWithDefault("sort")

   private[spark] val SHUFFLE_REDUCE_LOCALITY_ENABLE =
     ConfigBuilder("spark.shuffle.reduceLocality.enabled")
       .doc("Whether to compute locality preferences for reduce tasks")
+      .version("1.5.0")
       .booleanConf
       .createWithDefault(true)

   private[spark] val SHUFFLE_MAPOUTPUT_MIN_SIZE_FOR_BROADCAST =
     ConfigBuilder("spark.shuffle.mapOutput.minSizeForBroadcast")
       .doc("The size at which we use Broadcast to send the map output statuses to the executors.")
+      .version("2.0.0")
       .bytesConf(ByteUnit.BYTE)
       .createWithDefaultString("512k")

   private[spark] val SHUFFLE_MAPOUTPUT_DISPATCHER_NUM_THREADS =
     ConfigBuilder("spark.shuffle.mapOutput.dispatcher.numThreads")
+      .version("2.0.0")

 Review comment:
   SPARK-1239, commit ID: d98dd72e7baeb59eacec4fefd66397513a607b2f#diff-609c3f8c26150ca96a94cd27146a809b





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579365


 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1187,48 +1247,56 @@ package object config {
       .internal()
       .doc("Initial threshold for the size of a collection before we start tracking its " +
         "memory usage.")
+      .version("1.1.1")
       .bytesConf(ByteUnit.BYTE)
       .createWithDefault(5 * 1024 * 1024)

   private[spark] val SHUFFLE_SPILL_BATCH_SIZE =
     ConfigBuilder("spark.shuffle.spill.batchSize")
       .internal()
       .doc("Size of object batches when reading/writing from serializers.")
+      .version("0.9.0")
       .longConf
       .createWithDefault(1)

   private[spark] val SHUFFLE_SORT_BYPASS_MERGE_THRESHOLD =
     ConfigBuilder("spark.shuffle.sort.bypassMergeThreshold")
       .doc("In the sort-based shuffle manager, avoid merge-sorting data if there is no " +
         "map-side aggregation and there are at most this many reduce partitions")
+      .version("1.1.1")
       .intConf
       .createWithDefault(200)

   private[spark] val SHUFFLE_MANAGER =
     ConfigBuilder("spark.shuffle.manager")
+      .version("1.1.0")
       .stringConf
       .createWithDefault("sort")

   private[spark] val SHUFFLE_REDUCE_LOCALITY_ENABLE =
     ConfigBuilder("spark.shuffle.reduceLocality.enabled")
       .doc("Whether to compute locality preferences for reduce tasks")
+      .version("1.5.0")
       .booleanConf
       .createWithDefault(true)

   private[spark] val SHUFFLE_MAPOUTPUT_MIN_SIZE_FOR_BROADCAST =
     ConfigBuilder("spark.shuffle.mapOutput.minSizeForBroadcast")
       .doc("The size at which we use Broadcast to send the map output statuses to the executors.")
+      .version("2.0.0")

 Review comment:
   SPARK-1239, commit ID: d98dd72e7baeb59eacec4fefd66397513a607b2f#diff-609c3f8c26150ca96a94cd27146a809b





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579325
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1187,48 +1247,56 @@ package object config {
   .internal()
   .doc("Initial threshold for the size of a collection before we start tracking its " +
     "memory usage.")
+  .version("1.1.1")
   .bytesConf(ByteUnit.BYTE)
   .createWithDefault(5 * 1024 * 1024)
 
   private[spark] val SHUFFLE_SPILL_BATCH_SIZE =
 ConfigBuilder("spark.shuffle.spill.batchSize")
   .internal()
   .doc("Size of object batches when reading/writing from serializers.")
+  .version("0.9.0")
   .longConf
   .createWithDefault(1)
 
   private[spark] val SHUFFLE_SORT_BYPASS_MERGE_THRESHOLD =
 ConfigBuilder("spark.shuffle.sort.bypassMergeThreshold")
   .doc("In the sort-based shuffle manager, avoid merge-sorting data if there is no " +
     "map-side aggregation and there are at most this many reduce partitions")
+  .version("1.1.1")
   .intConf
   .createWithDefault(200)
 
   private[spark] val SHUFFLE_MANAGER =
 ConfigBuilder("spark.shuffle.manager")
+  .version("1.1.0")
 
 Review comment:
   SPARK-2044, commit ID: 508fd371d6dbb826fd8a00787d347235b549e189#diff-60df49b5d3c59f2c4540fa16a90033a1





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579304
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1187,48 +1247,56 @@ package object config {
   .internal()
   .doc("Initial threshold for the size of a collection before we start tracking its " +
     "memory usage.")
+  .version("1.1.1")
   .bytesConf(ByteUnit.BYTE)
   .createWithDefault(5 * 1024 * 1024)
 
   private[spark] val SHUFFLE_SPILL_BATCH_SIZE =
 ConfigBuilder("spark.shuffle.spill.batchSize")
   .internal()
   .doc("Size of object batches when reading/writing from serializers.")
+  .version("0.9.0")
   .longConf
   .createWithDefault(1)
 
   private[spark] val SHUFFLE_SORT_BYPASS_MERGE_THRESHOLD =
 ConfigBuilder("spark.shuffle.sort.bypassMergeThreshold")
   .doc("In the sort-based shuffle manager, avoid merge-sorting data if there is no " +
     "map-side aggregation and there are at most this many reduce partitions")
+  .version("1.1.1")
 
 Review comment:
   SPARK-2787, commit ID: 0f2274f8ed6131ad17326e3fff7f7e093863b72d#diff-31417c461d8901d8e08167b0cbc344c1





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579231
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1179,6 +1238,7 @@ package object config {
   .doc("The codec used to compress MapStatus, which is generated by ShuffleMapTask. " +
     "By default, Spark provides four codecs: lz4, lzf, snappy, and zstd. You can also " +
     "use fully qualified class names to specify the codec.")
+  .version("3.0.0")
 
 Review comment:
   SPARK-29939, commit ID: 456cfe6e4693efd26d64f089d53c4e01bf8150a2#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579273
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1187,48 +1247,56 @@ package object config {
   .internal()
   .doc("Initial threshold for the size of a collection before we start tracking its " +
     "memory usage.")
+  .version("1.1.1")
   .bytesConf(ByteUnit.BYTE)
   .createWithDefault(5 * 1024 * 1024)
 
   private[spark] val SHUFFLE_SPILL_BATCH_SIZE =
 ConfigBuilder("spark.shuffle.spill.batchSize")
   .internal()
   .doc("Size of object batches when reading/writing from serializers.")
+  .version("0.9.0")
 
 Review comment:
   No JIRA ID, commit ID: c3816de5040e3c48e58ed4762d2f4eb606812938#diff-a470b9812a5ac8c37d732da7d9fbe39a





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579261
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1187,48 +1247,56 @@ package object config {
   .internal()
   .doc("Initial threshold for the size of a collection before we start tracking its " +
     "memory usage.")
+  .version("1.1.1")
 
 Review comment:
   SPARK-4480, commit ID: 16bf5f3d17624db2a96c921fe8a1e153cdafb06c#diff-31417c461d8901d8e08167b0cbc344c1





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579195
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1163,13 +1220,15 @@ package object config {
 ConfigBuilder("spark.shuffle.compress")
   .doc("Whether to compress shuffle output. Compression will use " +
 "spark.io.compression.codec.")
+  .version("0.6.0")
 
 Review comment:
   No JIRA ID, commit ID: efc5423210d1aadeaea78273a4a8f10425753079#diff-76170a9c8f67b542bc58240a0a12fe08





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579204
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1163,13 +1220,15 @@ package object config {
 ConfigBuilder("spark.shuffle.compress")
   .doc("Whether to compress shuffle output. Compression will use " +
 "spark.io.compression.codec.")
+  .version("0.6.0")
   .booleanConf
   .createWithDefault(true)
 
   private[spark] val SHUFFLE_SPILL_COMPRESS =
 ConfigBuilder("spark.shuffle.spill.compress")
   .doc("Whether to compress data spilled during shuffles. Compression will use " +
     "spark.io.compression.codec.")
+  .version("0.9.0")
 
 Review comment:
   No JIRA ID, commit ID: c3816de5040e3c48e58ed4762d2f4eb606812938#diff-2b643ea78c1add0381754b1f47eec132





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579183
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1130,30 +1182,35 @@ package object config {
   .doc("Multi-thread is used when the number of mappers * shuffle partitions is greater than " +
     "or equal to this threshold. Note that the actual parallelism is calculated by number of " +
     "mappers * shuffle partitions / this threshold + 1, so this threshold should be positive.")
+  .version("2.3.0")
   .intConf
   .checkValue(v => v > 0, "The threshold should be positive.")
   .createWithDefault(1000)
 
   private[spark] val MAX_RESULT_SIZE = ConfigBuilder("spark.driver.maxResultSize")
 .doc("Size limit for results.")
+.version("1.2.0")
 .bytesConf(ByteUnit.BYTE)
 .createWithDefaultString("1g")
 
   private[spark] val CREDENTIALS_RENEWAL_INTERVAL_RATIO =
 ConfigBuilder("spark.security.credentials.renewalRatio")
   .doc("Ratio of the credential's expiration time when Spark should fetch new credentials.")
+  .version("2.4.0")
   .doubleConf
   .createWithDefault(0.75d)
 
   private[spark] val CREDENTIALS_RENEWAL_RETRY_WAIT =
 ConfigBuilder("spark.security.credentials.retryWait")
   .doc("How long to wait before retrying to fetch new credentials after a failure.")
+  .version("2.4.0")
   .timeConf(TimeUnit.SECONDS)
   .createWithDefaultString("1h")
 
   private[spark] val SHUFFLE_SORT_INIT_BUFFER_SIZE =
 ConfigBuilder("spark.shuffle.sort.initialBufferSize")
   .internal()
+  .version("2.1.0")
 
 Review comment:
   SPARK-15958, commit ID: bf665a958631125a1670504ef5966ef1a0e14798#diff-a1d00506391c1c4b2209f9bbff590c5b





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579133
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1130,30 +1182,35 @@ package object config {
   .doc("Multi-thread is used when the number of mappers * shuffle partitions is greater than " +
     "or equal to this threshold. Note that the actual parallelism is calculated by number of " +
     "mappers * shuffle partitions / this threshold + 1, so this threshold should be positive.")
+  .version("2.3.0")
   .intConf
   .checkValue(v => v > 0, "The threshold should be positive.")
   .createWithDefault(1000)
 
   private[spark] val MAX_RESULT_SIZE = ConfigBuilder("spark.driver.maxResultSize")
 .doc("Size limit for results.")
+.version("1.2.0")
 .bytesConf(ByteUnit.BYTE)
 .createWithDefaultString("1g")
 
   private[spark] val CREDENTIALS_RENEWAL_INTERVAL_RATIO =
 ConfigBuilder("spark.security.credentials.renewalRatio")
   .doc("Ratio of the credential's expiration time when Spark should fetch new credentials.")
+  .version("2.4.0")
 
 Review comment:
   SPARK-23361, commit ID: 5fa438471110afbf4e2174df449ac79e292501f8#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579142
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1130,30 +1182,35 @@ package object config {
   .doc("Multi-thread is used when the number of mappers * shuffle partitions is greater than " +
     "or equal to this threshold. Note that the actual parallelism is calculated by number of " +
     "mappers * shuffle partitions / this threshold + 1, so this threshold should be positive.")
+  .version("2.3.0")
   .intConf
   .checkValue(v => v > 0, "The threshold should be positive.")
   .createWithDefault(1000)
 
   private[spark] val MAX_RESULT_SIZE = ConfigBuilder("spark.driver.maxResultSize")
 .doc("Size limit for results.")
+.version("1.2.0")
 .bytesConf(ByteUnit.BYTE)
 .createWithDefaultString("1g")
 
   private[spark] val CREDENTIALS_RENEWAL_INTERVAL_RATIO =
 ConfigBuilder("spark.security.credentials.renewalRatio")
   .doc("Ratio of the credential's expiration time when Spark should fetch new credentials.")
+  .version("2.4.0")
   .doubleConf
   .createWithDefault(0.75d)
 
   private[spark] val CREDENTIALS_RENEWAL_RETRY_WAIT =
 ConfigBuilder("spark.security.credentials.retryWait")
   .doc("How long to wait before retrying to fetch new credentials after a failure.")
+  .version("2.4.0")
 
 Review comment:
   SPARK-23361, commit ID: 5fa438471110afbf4e2174df449ac79e292501f8#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579111
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1130,30 +1182,35 @@ package object config {
   .doc("Multi-thread is used when the number of mappers * shuffle partitions is greater than " +
     "or equal to this threshold. Note that the actual parallelism is calculated by number of " +
     "mappers * shuffle partitions / this threshold + 1, so this threshold should be positive.")
+  .version("2.3.0")
   .intConf
   .checkValue(v => v > 0, "The threshold should be positive.")
   .createWithDefault(1000)
 
   private[spark] val MAX_RESULT_SIZE = ConfigBuilder("spark.driver.maxResultSize")
 .doc("Size limit for results.")
+.version("1.2.0")
 
 Review comment:
   SPARK-3466, commit ID: 6181577e9935f46b646ba3925b873d031aa3d6ba#diff-d239aee594001f8391676e1047a0381e





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579036
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1121,6 +1172,7 @@ package object config {
     "By default it's Integer.MAX_VALUE, which means we never force the sorter to spill, " +
     "until we reach some limitations, like the max page size limitation for the pointer " +
     "array in the sorter.")
+  .version("1.6.0")
 
 Review comment:
   SPARK-10708, commit ID: f6d06adf05afa9c5386dc2396c94e7a98730289f#diff-3eedc75de4787b842477138d8cc7f150





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579078
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1130,30 +1182,35 @@ package object config {
   .doc("Multi-thread is used when the number of mappers * shuffle partitions is greater than " +
     "or equal to this threshold. Note that the actual parallelism is calculated by number of " +
     "mappers * shuffle partitions / this threshold + 1, so this threshold should be positive.")
+  .version("2.3.0")
 
 Review comment:
   SPARK-22537, commit ID: efd0036ec88bdc385f5a9ea568d2e2bbfcda2912#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] wangyum commented on issue #27874: [WIP][SPARK-31114][SQL] Constraints inferred from equality constraints with cast

2020-03-14 Thread GitBox
wangyum commented on issue #27874: [WIP][SPARK-31114][SQL] Constraints inferred 
from equality constraints with cast
URL: https://github.com/apache/spark/pull/27874#issuecomment-599045928
 
 
   I closed it because this change cannot handle this case:
   ```scala
   spark.sql("create table T1(a string)")
   spark.sql("create table T2(b string)")
   spark.sql("create table T3(c bigint)")
   spark.sql("create table T4(d bigint)")

   spark.sql(
     """
       |SELECT t1.a, t2.b, t4.d
       |FROM T1 t1 JOIN T2 t2
       |  ON (t1.a = t2.b)
       |JOIN T3 t3
       |  ON (t1.a = t3.c)
       |JOIN T4 t4
       |  ON (t3.c = t4.d)
       |""".stripMargin).explain()
   ```
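   For reference, the transitive chain the optimizer has to reason about in that query can be sketched with a tiny union-find over the qualified column names (a hand-rolled illustration, not Catalyst code; the names are taken from the query above). All four columns collapse into a single equivalence class, so any predicate inferred for `t1.a` (a string) must cross a cast to reach `t4.d` (a bigint):

   ```scala
   object EqualityClosure extends App {
     // Union-find over qualified column names; absent keys are their own roots.
     val parent = scala.collection.mutable.Map.empty[String, String]
     def find(x: String): String = parent.get(x) match {
       case Some(p) if p != x => find(p)
       case _                 => x
     }
     def union(x: String, y: String): Unit = parent(find(x)) = find(y)

     // Join conditions from the query: t1.a = t2.b, t1.a = t3.c, t3.c = t4.d
     Seq("t1.a" -> "t2.b", "t1.a" -> "t3.c", "t3.c" -> "t4.d").foreach {
       case (x, y) => union(x, y)
     }

     // All four columns share one root, i.e. one equivalence class.
     val roots = Seq("t1.a", "t2.b", "t3.c", "t4.d").map(find).distinct
     println(roots.size) // 1
   }
   ```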





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579015
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1104,12 +1153,14 @@ package object config {
     "where the YARN service does not support schemes that are supported by Spark, like http, " +
     "https and ftp, or jars required to be in the local YARN client's classpath. Wildcard " +
     "'*' is denoted to download resources for all the schemes.")
+  .version("2.3.0")
 
 Review comment:
   SPARK-21917, commit ID: 8319432af60b8e1dc00f08d794f7d80591e24d0c#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] wangyum closed pull request #27874: [WIP][SPARK-31114][SQL] Constraints inferred from equality constraints with cast

2020-03-14 Thread GitBox
wangyum closed pull request #27874: [WIP][SPARK-31114][SQL] Constraints 
inferred from equality constraints with cast
URL: https://github.com/apache/spark/pull/27874
 
 
   





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392579029
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1104,12 +1153,14 @@ package object config {
     "where the YARN service does not support schemes that are supported by Spark, like http, " +
     "https and ftp, or jars required to be in the local YARN client's classpath. Wildcard " +
     "'*' is denoted to download resources for all the schemes.")
+  .version("2.3.0")
   .stringConf
   .toSequence
   .createWithDefault(Nil)
 
   private[spark] val EXTRA_LISTENERS = ConfigBuilder("spark.extraListeners")
 .doc("Class names of listeners to add to SparkContext during initialization.")
+.version("1.3.0")
 
 Review comment:
   SPARK-5411, commit ID: 47e4d579eb4a9aab8e0dd9c1400394d80c8d0388#diff-364713d7776956cb8b0a771e9b62f82d





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578941
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1076,6 +1122,7 @@ package object config {
   private[spark] val SHUFFLE_DISK_WRITE_BUFFER_SIZE =
 ConfigBuilder("spark.shuffle.spill.diskWriteBufferSize")
   .doc("The buffer size, in bytes, to use when writing the sorted records to an on-disk file.")
+  .version("2.3.0")
 
 Review comment:
   SPARK-20950, commit ID: 565e7a8d4ae7879ee704fb94ae9b3da31e202d7e#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578979
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1087,13 +1134,15 @@ package object config {
   .internal()
   .doc("The memory check period is used to determine how often we should check whether "
     + "there is a need to request more memory when we try to unroll the given block in memory.")
+  .version("2.3.0")
 
 Review comment:
   SPARK-21923, commit ID: a11db942aaf4c470a85f8a1b180f034f7a584254#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578985
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1087,13 +1134,15 @@ package object config {
   .internal()
   .doc("The memory check period is used to determine how often we should check whether "
     + "there is a need to request more memory when we try to unroll the given block in memory.")
+  .version("2.3.0")
   .longConf
   .createWithDefault(16)
 
   private[spark] val UNROLL_MEMORY_GROWTH_FACTOR =
 ConfigBuilder("spark.storage.unrollMemoryGrowthFactor")
   .internal()
   .doc("Memory to request as a multiple of the size that used to unroll the block.")
+  .version("2.3.0")
 
 Review comment:
   SPARK-21923, commit ID: a11db942aaf4c470a85f8a1b180f034f7a584254#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578930
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1067,6 +1112,7 @@ package object config {
 ConfigBuilder("spark.shuffle.unsafe.file.output.buffer")
   .doc("The file system for this buffer size after each partition " +
     "is written in unsafe shuffle writer. In KiB unless otherwise specified.")
+  .version("2.3.0")
 
 Review comment:
   SPARK-20950, commit ID: 565e7a8d4ae7879ee704fb94ae9b3da31e202d7e#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578913
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1057,6 +1101,7 @@ package object config {
   .doc("Size of the in-memory buffer for each shuffle file output stream, 
in KiB unless " +
 "otherwise specified. These buffers reduce the number of disk seeks 
and system calls " +
 "made in creating intermediate shuffle files.")
+  .version("1.4.0")
 
 Review comment:
   SPARK-7081, commit ID: 
c53ebea9db418099df50f9adc1a18cee7849cd97#diff-ecdafc46b901740134261d2cab24ccd9





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578898
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1043,12 +1085,14 @@ package object config {
   .doc("Enable tracking of updatedBlockStatuses in the TaskMetrics. Off by 
default since " +
 "tracking the block statuses can use a lot of memory and its not used 
anywhere within " +
 "spark.")
+  .version("2.3.0")
   .booleanConf
   .createWithDefault(false)
 
   private[spark] val SHUFFLE_IO_PLUGIN_CLASS =
 ConfigBuilder("spark.shuffle.sort.io.plugin.class")
   .doc("Name of the class to use for shuffle IO.")
+  .version("3.0.0")
 
 Review comment:
   SPARK-28209, commit ID: 
abef84a868e9e15f346eea315bbab0ec8ac8e389#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578868
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1043,12 +1085,14 @@ package object config {
   .doc("Enable tracking of updatedBlockStatuses in the TaskMetrics. Off by 
default since " +
 "tracking the block statuses can use a lot of memory and its not used 
anywhere within " +
 "spark.")
+  .version("2.3.0")
 
 Review comment:
   SPARK-20923, commit ID: 
5b5a69bea9de806e2c39b04b248ee82a7b664d7b#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578847
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1028,6 +1069,7 @@ package object config {
 "configuration will affect both shuffle fetch and block manager remote 
block fetch. " +
 "For users who enabled external shuffle service, this feature can only 
work when " +
 "external shuffle service is at least 2.3.0.")
+  .version("3.0.0")
 
 Review comment:
   SPARK-26700, commit ID: 
d8613571bc1847775dd5c1945757279234cb388c#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578782
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -994,19 +1031,22 @@ package object config {
   .doc("Threshold in bytes above which the size of shuffle blocks in " +
 "HighlyCompressedMapStatus is accurately recorded. This helps to 
prevent OOM " +
 "by avoiding underestimating shuffle block size when fetch shuffle 
blocks.")
+  .version("2.2.1")
   .bytesConf(ByteUnit.BYTE)
   .createWithDefault(100 * 1024 * 1024)
 
   private[spark] val SHUFFLE_REGISTRATION_TIMEOUT =
 ConfigBuilder("spark.shuffle.registration.timeout")
   .doc("Timeout in milliseconds for registration to the external shuffle 
service.")
+  .version("2.3.0")
   .timeConf(TimeUnit.MILLISECONDS)
   .createWithDefault(5000)
 
   private[spark] val SHUFFLE_REGISTRATION_MAX_ATTEMPTS =
 ConfigBuilder("spark.shuffle.registration.maxAttempts")
   .doc("When we fail to register to the external shuffle service, we will 
" +
 "retry for maxAttempts times.")
+  .version("2.3.0")
 
 Review comment:
   SPARK-20640, commit ID: 
d107b3b910d8f434fb15b663a9db4c2dfe0a9f43#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578767
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -994,19 +1031,22 @@ package object config {
   .doc("Threshold in bytes above which the size of shuffle blocks in " +
 "HighlyCompressedMapStatus is accurately recorded. This helps to 
prevent OOM " +
 "by avoiding underestimating shuffle block size when fetch shuffle 
blocks.")
+  .version("2.2.1")
   .bytesConf(ByteUnit.BYTE)
   .createWithDefault(100 * 1024 * 1024)
 
   private[spark] val SHUFFLE_REGISTRATION_TIMEOUT =
 ConfigBuilder("spark.shuffle.registration.timeout")
   .doc("Timeout in milliseconds for registration to the external shuffle 
service.")
+  .version("2.3.0")
 
 Review comment:
   SPARK-20640, commit ID: 
d107b3b910d8f434fb15b663a9db4c2dfe0a9f43#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578811
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -1017,6 +1057,7 @@ package object config {
 "address in a single fetch or simultaneously, this could crash the 
serving executor or " +
 "Node Manager. This is especially useful to reduce the load on the 
Node Manager when " +
 "external shuffle is enabled. You can mitigate the issue by setting it 
to a lower value.")
+  .version("2.2.1")
 
 Review comment:
   SPARK-21243, commit ID: 
88dccda393bc79dc6032f71b6acf8eb2b4b152be#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578751
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -994,19 +1031,22 @@ package object config {
   .doc("Threshold in bytes above which the size of shuffle blocks in " +
 "HighlyCompressedMapStatus is accurately recorded. This helps to 
prevent OOM " +
 "by avoiding underestimating shuffle block size when fetch shuffle 
blocks.")
+  .version("2.2.1")
 
 Review comment:
   SPARK-20801, commit ID: 
81f63c8923416014d5c6bc227dd3c4e2a62bac8e#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578726
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -985,6 +1021,7 @@ package object config {
 "Caching preferred locations can relieve query loading to DFS and save 
the query " +
 "time. The drawback is that the cached locations can be possibly 
outdated and " +
 "lose data locality. If this config is not specified, it will not 
cache.")
+  .version("3.0.0")
 
 Review comment:
   SPARK-29182, commit ID: 
4ecbdbb6a7bd3908da32c82832e886b4f9f9e596#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578694
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -975,6 +1010,7 @@ package object config {
 ConfigBuilder("spark.checkpoint.compress")
   .doc("Whether to compress RDD checkpoints. Generally a good idea. 
Compression will use " +
 "spark.io.compression.codec.")
+  .version("2.2.0")
 
 Review comment:
   SPARK-19525, commit ID: 
1405862382185e04b09f84af18f82f2f0295a755#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578673
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -959,12 +992,14 @@ package object config {
 "specified for the executors. The fallback configuration allows the 
same path to be " +
 "used for both the driver and the executors when running in cluster 
mode. File-based " +
 "secret keys are only allowed when using Kubernetes.")
+  .version("3.0.0")
   .fallbackConf(AUTH_SECRET_FILE)
 
   private[spark] val BUFFER_WRITE_CHUNK_SIZE =
 ConfigBuilder("spark.buffer.write.chunkSize")
   .internal()
   .doc("The chunk size in bytes during writing out the bytes of 
ChunkedByteBuffer.")
+  .version("2.3.0")
 
 Review comment:
   SPARK-21527, commit ID: 
574ef6c987c636210828e96d2f797d8f10aff05e#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578647
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -959,12 +992,14 @@ package object config {
 "specified for the executors. The fallback configuration allows the 
same path to be " +
 "used for both the driver and the executors when running in cluster 
mode. File-based " +
 "secret keys are only allowed when using Kubernetes.")
+  .version("3.0.0")
 
 Review comment:
   SPARK-26239, commit ID: 
57d6fbfa8c803ce1791e7be36aba0219a1fcaa63#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578640
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -948,6 +980,7 @@ package object config {
 "be specified for the executors. The fallback configuration allows the 
same path to be " +
 "used for both the driver and the executors when running in cluster 
mode. File-based " +
 "secret keys are only allowed when using Kubernetes.")
+  .version("3.0.0")
 
 Review comment:
   SPARK-26239, commit ID: 
57d6fbfa8c803ce1791e7be36aba0219a1fcaa63#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578633
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -936,6 +967,7 @@ package object config {
 "loaded from this path on both the driver and the executors if 
overrides are not set for " +
 "either entity (see below). File-based secret keys are only allowed 
when using " +
 "Kubernetes.")
+  .version("3.0.0")
 
 Review comment:
   SPARK-26239, commit ID: 
57d6fbfa8c803ce1791e7be36aba0219a1fcaa63#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578580
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -907,26 +933,31 @@ package object config {
   .doc("Regex to decide which parts of strings produced by Spark contain 
sensitive " +
 "information. When this regex matches a string part, that string part 
is replaced by a " +
 "dummy value. This is currently used to redact the output of SQL 
explain commands.")
+  .version("2.2.0")
   .regexConf
   .createOptional
 
   private[spark] val AUTH_SECRET =
 ConfigBuilder("spark.authenticate.secret")
+  .version("1.0.0")
   .stringConf
   .createOptional
 
   private[spark] val AUTH_SECRET_BIT_LENGTH =
 ConfigBuilder("spark.authenticate.secretBitLength")
+  .version("1.6.0")
   .intConf
   .createWithDefault(256)
 
   private[spark] val NETWORK_AUTH_ENABLED =
 ConfigBuilder("spark.authenticate")
+  .version("1.0.0")
   .booleanConf
   .createWithDefault(false)
 
   private[spark] val SASL_ENCRYPTION_ENABLED =
 ConfigBuilder("spark.authenticate.enableSaslEncryption")
+  .version("1.4.0")
 
 Review comment:
   SPARK-6229, commit ID: 
38d4e9e446b425ca6a8fe8d8080f387b08683842#diff-afd88f677ec5ff8b5e96a5cbbe00cd98





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578512
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -907,26 +933,31 @@ package object config {
   .doc("Regex to decide which parts of strings produced by Spark contain 
sensitive " +
 "information. When this regex matches a string part, that string part 
is replaced by a " +
 "dummy value. This is currently used to redact the output of SQL 
explain commands.")
+  .version("2.2.0")
   .regexConf
   .createOptional
 
   private[spark] val AUTH_SECRET =
 ConfigBuilder("spark.authenticate.secret")
+  .version("1.0.0")
   .stringConf
   .createOptional
 
   private[spark] val AUTH_SECRET_BIT_LENGTH =
 ConfigBuilder("spark.authenticate.secretBitLength")
+  .version("1.6.0")
   .intConf
   .createWithDefault(256)
 
   private[spark] val NETWORK_AUTH_ENABLED =
 ConfigBuilder("spark.authenticate")
+  .version("1.0.0")
 
 Review comment:
   SPARK-1189, commit ID: 
7edbea41b43e0dc11a2de156be220db8b7952d01#diff-afd88f677ec5ff8b5e96a5cbbe00cd98





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578540
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -907,26 +933,31 @@ package object config {
   .doc("Regex to decide which parts of strings produced by Spark contain 
sensitive " +
 "information. When this regex matches a string part, that string part 
is replaced by a " +
 "dummy value. This is currently used to redact the output of SQL 
explain commands.")
+  .version("2.2.0")
   .regexConf
   .createOptional
 
   private[spark] val AUTH_SECRET =
 ConfigBuilder("spark.authenticate.secret")
+  .version("1.0.0")
   .stringConf
   .createOptional
 
   private[spark] val AUTH_SECRET_BIT_LENGTH =
 ConfigBuilder("spark.authenticate.secretBitLength")
+  .version("1.6.0")
 
 Review comment:
   SPARK-11073, commit ID: 
f8d93edec82eedab59d50aec06ca2de7e4cf14f6#diff-afd88f677ec5ff8b5e96a5cbbe00cd98





[GitHub] [spark] AmplabJenkins removed a comment on issue #27517: [SPARK-29721][SQL] Prune unnecessary nested fields from Generate without Project

2020-03-14 Thread GitBox
AmplabJenkins removed a comment on issue #27517: [SPARK-29721][SQL] Prune 
unnecessary nested fields from Generate without Project
URL: https://github.com/apache/spark/pull/27517#issuecomment-599044459
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/119789/
   Test PASSed.





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578508
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -907,26 +933,31 @@ package object config {
   .doc("Regex to decide which parts of strings produced by Spark contain 
sensitive " +
 "information. When this regex matches a string part, that string part 
is replaced by a " +
 "dummy value. This is currently used to redact the output of SQL 
explain commands.")
+  .version("2.2.0")
   .regexConf
   .createOptional
 
   private[spark] val AUTH_SECRET =
 ConfigBuilder("spark.authenticate.secret")
+  .version("1.0.0")
 
 Review comment:
   SPARK-1189, commit ID: 
7edbea41b43e0dc11a2de156be220db8b7952d01#diff-afd88f677ec5ff8b5e96a5cbbe00cd98





[GitHub] [spark] AmplabJenkins removed a comment on issue #27517: [SPARK-29721][SQL] Prune unnecessary nested fields from Generate without Project

2020-03-14 Thread GitBox
AmplabJenkins removed a comment on issue #27517: [SPARK-29721][SQL] Prune 
unnecessary nested fields from Generate without Project
URL: https://github.com/apache/spark/pull/27517#issuecomment-599044457
 
 
   Merged build finished. Test PASSed.





[GitHub] [spark] AmplabJenkins commented on issue #27517: [SPARK-29721][SQL] Prune unnecessary nested fields from Generate without Project

2020-03-14 Thread GitBox
AmplabJenkins commented on issue #27517: [SPARK-29721][SQL] Prune unnecessary 
nested fields from Generate without Project
URL: https://github.com/apache/spark/pull/27517#issuecomment-599044459
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/119789/
   Test PASSed.





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578483
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -899,6 +924,7 @@ package object config {
 "driver and executor environments contain sensitive information. When 
this regex matches " +
 "a property key or value, the value is redacted from the environment 
UI and various logs " +
 "like YARN and event logs.")
+  .version("2.1.2")
 
 Review comment:
   SPARK-18535 and SPARK-19720, commit ID: 
444cca14d7ac8c5ab5d7e9d080b11f4d6babe3bf#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] AmplabJenkins commented on issue #27517: [SPARK-29721][SQL] Prune unnecessary nested fields from Generate without Project

2020-03-14 Thread GitBox
AmplabJenkins commented on issue #27517: [SPARK-29721][SQL] Prune unnecessary 
nested fields from Generate without Project
URL: https://github.com/apache/spark/pull/27517#issuecomment-599044457
 
 
   Merged build finished. Test PASSed.





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578492
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -907,26 +933,31 @@ package object config {
   .doc("Regex to decide which parts of strings produced by Spark contain 
sensitive " +
 "information. When this regex matches a string part, that string part 
is replaced by a " +
 "dummy value. This is currently used to redact the output of SQL 
explain commands.")
+  .version("2.2.0")
 
 Review comment:
   SPARK-20070, commit ID: 91fa80fe8a2480d64c430bd10f97b3d44c007bcc#diff-6bdad48cfc34314e89599655442ff210
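   As a rough illustration of what such a redaction regex does, here is a sketch that hides values whose key or value matches a pattern. The pattern and the replacement text below are assumptions made for the example, not Spark's exact defaults:

   ```scala
   import scala.util.matching.Regex

   // Illustrative pattern and replacement; Spark's actual defaults may differ.
   val redactionRegex: Regex = "(?i)secret|password|token".r
   val Redacted = "*********(redacted)"

   def redactEnv(kv: Map[String, String]): Map[String, String] =
     kv.map { case (k, v) =>
       // If the regex matches either the property key or its value,
       // replace the value with a dummy string.
       if (redactionRegex.findFirstIn(k).isDefined ||
           redactionRegex.findFirstIn(v).isDefined) (k, Redacted)
       else (k, v)
     }
   ```

   Matching on both key and value is what the doc string for the environment-redaction config describes; the string-redaction config reviewed here applies the same idea to arbitrary string parts, such as explain output.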


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578431
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -776,105 +776,128 @@ package object config {
 
   // This property sets the root namespace for metrics reporting
   private[spark] val METRICS_NAMESPACE = ConfigBuilder("spark.metrics.namespace")
+.version("2.1.0")
 .stringConf
 .createOptional
 
   private[spark] val METRICS_CONF = ConfigBuilder("spark.metrics.conf")
+.version("0.8.0")
 .stringConf
 .createOptional
 
   private[spark] val METRICS_EXECUTORMETRICS_SOURCE_ENABLED =
 ConfigBuilder("spark.metrics.executorMetricsSource.enabled")
   .doc("Whether to register the ExecutorMetrics source with the metrics system.")
+  .version("3.0.0")
   .booleanConf
   .createWithDefault(true)
 
   private[spark] val METRICS_STATIC_SOURCES_ENABLED =
 ConfigBuilder("spark.metrics.staticSources.enabled")
   .doc("Whether to register static sources with the metrics system.")
+  .version("3.0.0")
   .booleanConf
   .createWithDefault(true)
 
   private[spark] val PYSPARK_DRIVER_PYTHON = ConfigBuilder("spark.pyspark.driver.python")
+.version("2.1.0")
 .stringConf
 .createOptional
 
   private[spark] val PYSPARK_PYTHON = ConfigBuilder("spark.pyspark.python")
+.version("2.1.0")
 .stringConf
 .createOptional
 
   // To limit how many applications are shown in the History Server summary ui
   private[spark] val HISTORY_UI_MAX_APPS =
-ConfigBuilder("spark.history.ui.maxApplications").intConf.createWithDefault(Integer.MAX_VALUE)
+ConfigBuilder("spark.history.ui.maxApplications")
+  .version("2.0.1")
+  .intConf
+  .createWithDefault(Integer.MAX_VALUE)
 
   private[spark] val IO_ENCRYPTION_ENABLED = ConfigBuilder("spark.io.encryption.enabled")
+.version("2.1.0")
 .booleanConf
 .createWithDefault(false)
 
   private[spark] val IO_ENCRYPTION_KEYGEN_ALGORITHM =
 ConfigBuilder("spark.io.encryption.keygen.algorithm")
+  .version("2.1.0")
   .stringConf
   .createWithDefault("HmacSHA1")
 
   private[spark] val IO_ENCRYPTION_KEY_SIZE_BITS = ConfigBuilder("spark.io.encryption.keySizeBits")
+.version("2.1.0")
 .intConf
 .checkValues(Set(128, 192, 256))
 .createWithDefault(128)
 
   private[spark] val IO_CRYPTO_CIPHER_TRANSFORMATION =
 ConfigBuilder("spark.io.crypto.cipher.transformation")
   .internal()
+  .version("2.1.0")
   .stringConf
   .createWithDefaultString("AES/CTR/NoPadding")
 
   private[spark] val DRIVER_HOST_ADDRESS = ConfigBuilder("spark.driver.host")
 .doc("Address of driver endpoints.")
+.version("0.7.0")
 .stringConf
 .createWithDefault(Utils.localCanonicalHostName())
 
   private[spark] val DRIVER_PORT = ConfigBuilder("spark.driver.port")
 .doc("Port of driver endpoints.")
+.version("0.7.0")
 .intConf
 .createWithDefault(0)
 
   private[spark] val DRIVER_SUPERVISE = ConfigBuilder("spark.driver.supervise")
 .doc("If true, restarts the driver automatically if it fails with a non-zero exit status. " +
   "Only has effect in Spark standalone mode or Mesos cluster deploy mode.")
+.version("1.3.0")
 .booleanConf
 .createWithDefault(false)
 
   private[spark] val DRIVER_BIND_ADDRESS = ConfigBuilder("spark.driver.bindAddress")
 .doc("Address where to bind network listen sockets on the driver.")
+.version("2.1.0")
 .fallbackConf(DRIVER_HOST_ADDRESS)
 
   private[spark] val BLOCK_MANAGER_PORT = ConfigBuilder("spark.blockManager.port")
 .doc("Port to use for the block manager when a more specific setting is not provided.")
+.version("1.1.0")
 .intConf
 .createWithDefault(0)
 
   private[spark] val DRIVER_BLOCK_MANAGER_PORT = ConfigBuilder("spark.driver.blockManager.port")
 .doc("Port to use for the block manager on the driver.")
+.version("2.1.0")
 .fallbackConf(BLOCK_MANAGER_PORT)
 
   private[spark] val IGNORE_CORRUPT_FILES = ConfigBuilder("spark.files.ignoreCorruptFiles")
 .doc("Whether to ignore corrupt files. If true, the Spark jobs will continue to run when " +
   "encountering corrupted or non-existing files and contents that have been read will still " +
   "be returned.")
+.version("2.1.0")
 .booleanConf
 .createWithDefault(false)
 
   private[spark] val IGNORE_MISSING_FILES = ConfigBuilder("spark.files.ignoreMissingFiles")
 .doc("Whether to ignore missing files. If true, the Spark jobs will continue to run when " +
   "encountering missing files and the contents that have been read will still be returned.")
+.version("2.4.0")
 .booleanConf
 .createWithDefault(false)
 
   private[spark] val APP_CALLER_CONTEXT = ConfigBuilder("s

[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578466
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -883,13 +906,15 @@ package object config {
   " the same time. This is used when putting multiple files into a 
partition. It's better to" +
   " over estimate, then the partitions with small files will be faster 
than partitions with" +
   " bigger files.")
+.version("2.1.0")
 .bytesConf(ByteUnit.BYTE)
 .createWithDefault(4 * 1024 * 1024)
 
   private[spark] val HADOOP_RDD_IGNORE_EMPTY_SPLITS =
 ConfigBuilder("spark.hadoopRDD.ignoreEmptySplits")
   .internal()
   .doc("When true, HadoopRDD/NewHadoopRDD will not create partitions for 
empty input splits.")
+  .version("2.3.0")
 
 Review comment:
   SPARK-22233, commit ID: 0fa10666cf75e3c4929940af49c8a6f6ea874759#diff-6bdad48cfc34314e89599655442ff210


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins removed a comment on issue #27906: [SPARK-31150][SQL] Parsing seconds fraction with variable length for timestamp

2020-03-14 Thread GitBox
AmplabJenkins removed a comment on issue #27906: [SPARK-31150][SQL] Parsing 
seconds fraction with variable length for timestamp
URL: https://github.com/apache/spark/pull/27906#issuecomment-599044149
 
 
   Merged build finished. Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] SparkQA removed a comment on issue #27517: [SPARK-29721][SQL] Prune unnecessary nested fields from Generate without Project

2020-03-14 Thread GitBox
SparkQA removed a comment on issue #27517: [SPARK-29721][SQL] Prune unnecessary 
nested fields from Generate without Project
URL: https://github.com/apache/spark/pull/27517#issuecomment-599022031
 
 
   **[Test build #119789 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/119789/testReport)** for PR 27517 at commit [`289355f`](https://github.com/apache/spark/commit/289355f7bae4b05f3e09ee4e38b9bbf4ca9b45de).


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins removed a comment on issue #27906: [SPARK-31150][SQL] Parsing seconds fraction with variable length for timestamp

2020-03-14 Thread GitBox
AmplabJenkins removed a comment on issue #27906: [SPARK-31150][SQL] Parsing 
seconds fraction with variable length for timestamp
URL: https://github.com/apache/spark/pull/27906#issuecomment-599044153
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/119790/
   Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578435
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -883,13 +906,15 @@ package object config {
   " the same time. This is used when putting multiple files into a 
partition. It's better to" +
   " over estimate, then the partitions with small files will be faster 
than partitions with" +
   " bigger files.")
+.version("2.1.0")
 
 Review comment:
   SPARK-16575, commit ID: c8879bf1ee2af9ccd5d5656571d931d2fc1da024#diff-6bdad48cfc34314e89599655442ff210


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] maropu commented on issue #27872: [SPARK-31115][SQL] Detect known Janino bug janino-compiler/janino#113 and apply workaround automatically as a fail-back via avoid using switch statem

2020-03-14 Thread GitBox
maropu commented on issue #27872: [SPARK-31115][SQL] Detect known Janino bug 
janino-compiler/janino#113 and apply workaround automatically as a fail-back 
via avoid using switch statement in generated code
URL: https://github.com/apache/spark/pull/27872#issuecomment-599044117
 
 
   I think option 4 looks fine to me. By the way, is splitting the large code inside the `switch` into pieces a solution for this issue, or do we also need to replace the `switch` with `if`?
   
   > Modify ExpandExec to check the number of operations in for statement, and use if ~ else if when the number of operations exceed the threshold. This should be ideally checking the length of offset but it would be weird if Spark does it, so count the lines blindly. Performance regression may happen in some cases where it can run with switch but due to blind count it runs with if ~ else if, but the case wouldn't be common.
   
   I just want to know the actual performance numbers of this approach; splitting large code into small parts might even improve performance.
   
   > I have one, but I cannot share since the query is from actual customer. If you're OK with just generated code, I've attached the file in Janino issue janino-compiler/janino#113.
   
   To reproduce the issue, could you build a simple query, based on your customer's private query, that you can share with us? A concrete query would help us understand the issue better.
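   For what it's worth, the decision described in option 4 can be sketched as a tiny code generator: below a threshold, emit one `switch`; above it, emit a chain of `if ~ else if`. The threshold value and the `project$i()` method names below are made up for illustration; Spark's real ExpandExec codegen is considerably more involved:

   ```scala
   // Illustrative threshold, not a real Spark config.
   val maxSwitchCases = 3

   // Generate Java-source dispatch code for `numProjections` projections.
   def genDispatch(numProjections: Int): String =
     if (numProjections <= maxSwitchCases) {
       // Small case count: a plain switch is fine for Janino.
       val cases = (0 until numProjections)
         .map(i => s"case $i: project$i(); break;").mkString(" ")
       s"switch (i) { $cases }"
     } else {
       // Large case count: fall back to chained if/else-if branches,
       // which avoids the oversized-switch pattern that trips Janino.
       (0 until numProjections)
         .map(i => s"if (i == $i) { project$i(); }").mkString(" else ")
     }
   ```

   Counting projections (or generated lines) "blindly", as the quoted proposal suggests, is exactly this kind of threshold check on the generator side.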


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] SparkQA commented on issue #27517: [SPARK-29721][SQL] Prune unnecessary nested fields from Generate without Project

2020-03-14 Thread GitBox
SparkQA commented on issue #27517: [SPARK-29721][SQL] Prune unnecessary nested 
fields from Generate without Project
URL: https://github.com/apache/spark/pull/27517#issuecomment-599044181
 
 
   **[Test build #119789 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/119789/testReport)** for PR 27517 at commit [`289355f`](https://github.com/apache/spark/commit/289355f7bae4b05f3e09ee4e38b9bbf4ca9b45de).
* This patch passes all tests.
* This patch merges cleanly.
* This patch adds no public classes.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins commented on issue #27906: [SPARK-31150][SQL] Parsing seconds fraction with variable length for timestamp

2020-03-14 Thread GitBox
AmplabJenkins commented on issue #27906: [SPARK-31150][SQL] Parsing seconds 
fraction with variable length for timestamp
URL: https://github.com/apache/spark/pull/27906#issuecomment-599044153
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/119790/
   Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] AmplabJenkins commented on issue #27906: [SPARK-31150][SQL] Parsing seconds fraction with variable length for timestamp

2020-03-14 Thread GitBox
AmplabJenkins commented on issue #27906: [SPARK-31150][SQL] Parsing seconds 
fraction with variable length for timestamp
URL: https://github.com/apache/spark/pull/27906#issuecomment-599044149
 
 
   Merged build finished. Test PASSed.


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578384
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -776,105 +776,128 @@ package object config {
 
  private[spark] val IGNORE_MISSING_FILES = ConfigBuilder("spark.files.ignoreMissingFiles")
 .doc("Whether to ignore missing files. If true, the Spark jobs will continue to run when " +
   "encountering missing files and the contents that have been read will still be returned.")
+.version("2.4.0")
 Review comment:
   SPARK-22676, commit ID: ed4101d29f50d54fd7846421e4c00e9ecd3599d0#diff-6bdad48cfc34314e89

[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578403
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -776,105 +776,128 @@ package object config {
 

[GitHub] [spark] SparkQA removed a comment on issue #27906: [SPARK-31150][SQL] Parsing seconds fraction with variable length for timestamp

2020-03-14 Thread GitBox
SparkQA removed a comment on issue #27906: [SPARK-31150][SQL] Parsing seconds 
fraction with variable length for timestamp
URL: https://github.com/apache/spark/pull/27906#issuecomment-599022583
 
 
   **[Test build #119790 has started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/119790/testReport)** for PR 27906 at commit [`b135dd5`](https://github.com/apache/spark/commit/b135dd50e42446ce277c1c6e04477b3a8ca64427).


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578371
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -776,105 +776,128 @@ package object config {
 
  private[spark] val IGNORE_CORRUPT_FILES = ConfigBuilder("spark.files.ignoreCorruptFiles")
 .doc("Whether to ignore corrupt files. If true, the Spark jobs will continue to run when " +
   "encountering corrupted or non-existing files and contents that have been read will still " +
   "be returned.")
+.version("2.1.0")
 
 Review comment:
   SPARK-17850, commit ID: 
47776e7c0c68590fe446cef910900b1aaead06f9#diff-6bdad48cfc34314e89599655442ff210
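
The entries quoted above all follow the fluent pattern of Spark's internal `ConfigBuilder`, where this PR inserts a `.version()` stage recording the release that introduced each key before the typed stage (`booleanConf`, `intConf`, ...) fixes the default. The following is a minimal, self-contained sketch of that builder shape — a hypothetical simplification for illustration, not the real `org.apache.spark.internal.config` API, which has many more stages (`doc`, `checkValues`, `fallbackConf`, ...):

```scala
// Hypothetical, stripped-down stand-in for Spark's internal ConfigBuilder.
final case class ConfigEntry[T](key: String, version: String, default: Option[T])

final class TypedBuilder[T](key: String, version: String) {
  def createWithDefault(d: T): ConfigEntry[T] = ConfigEntry(key, version, Some(d))
  def createOptional: ConfigEntry[T] = ConfigEntry(key, version, None)
}

final class ConfigBuilder(key: String) {
  private var ver = ""  // release that introduced the key, set by .version()
  def version(v: String): ConfigBuilder = { ver = v; this }
  def booleanConf: TypedBuilder[Boolean] = new TypedBuilder[Boolean](key, ver)
  def intConf: TypedBuilder[Int] = new TypedBuilder[Int](key, ver)
}

// Mirrors the spark.io.encryption.enabled entry from the diff above.
val ioEncryptionEnabled: ConfigEntry[Boolean] =
  new ConfigBuilder("spark.io.encryption.enabled")
    .version("2.1.0")
    .booleanConf
    .createWithDefault(false)
```

The point of the review thread is that `.version()` is pure metadata: it changes no runtime behavior, which is why each comment only has to cite the JIRA ID and commit that first introduced the key.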


This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
 
For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


With regards,
Apache Git Services

[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578333
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -776,105 +776,128 @@ package object config {
 
   // This property sets the root namespace for metrics reporting
   private[spark] val METRICS_NAMESPACE = ConfigBuilder("spark.metrics.namespace")
+    .version("2.1.0")
     .stringConf
     .createOptional
 
   private[spark] val METRICS_CONF = ConfigBuilder("spark.metrics.conf")
+    .version("0.8.0")
     .stringConf
     .createOptional
 
   private[spark] val METRICS_EXECUTORMETRICS_SOURCE_ENABLED =
     ConfigBuilder("spark.metrics.executorMetricsSource.enabled")
       .doc("Whether to register the ExecutorMetrics source with the metrics system.")
+      .version("3.0.0")
       .booleanConf
       .createWithDefault(true)
 
   private[spark] val METRICS_STATIC_SOURCES_ENABLED =
     ConfigBuilder("spark.metrics.staticSources.enabled")
       .doc("Whether to register static sources with the metrics system.")
+      .version("3.0.0")
       .booleanConf
       .createWithDefault(true)
 
   private[spark] val PYSPARK_DRIVER_PYTHON = ConfigBuilder("spark.pyspark.driver.python")
+    .version("2.1.0")
     .stringConf
     .createOptional
 
   private[spark] val PYSPARK_PYTHON = ConfigBuilder("spark.pyspark.python")
+    .version("2.1.0")
     .stringConf
     .createOptional
 
   // To limit how many applications are shown in the History Server summary ui
   private[spark] val HISTORY_UI_MAX_APPS =
-    ConfigBuilder("spark.history.ui.maxApplications").intConf.createWithDefault(Integer.MAX_VALUE)
+    ConfigBuilder("spark.history.ui.maxApplications")
+      .version("2.0.1")
+      .intConf
+      .createWithDefault(Integer.MAX_VALUE)
 
 (remaining entries of the hunk, spark.io.encryption.enabled through spark.driver.blockManager.port, repeat the diff quoted in the first review comment above; the quoted context ends at spark.driver.blockManager.port's new `.version("2.1.0")` line)
 
 Review comment (on `spark.driver.blockManager.port`):
   SPARK-4563, commit ID: 2cd1bfa4f0c6625b0ab1dbeba2b9586b9a6a9f42#diff-6bdad48cfc34314e89599655442ff210
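
Note that `spark.driver.blockManager.port` is declared with `.fallbackConf(BLOCK_MANAGER_PORT)` rather than its own default, so an unset driver-side key resolves to whatever `spark.blockManager.port` holds. A hedged sketch of that resolution chain — hypothetical names, not Spark's actual implementation:

```scala
// Hypothetical model of fallbackConf resolution: a key either has its own
// value in the settings map or defers to its parent entry, recursively.
final case class FallbackEntry(key: String, parent: Option[FallbackEntry] = None)

def resolve(settings: Map[String, String], e: FallbackEntry): Option[String] =
  settings.get(e.key).orElse(e.parent.flatMap(p => resolve(settings, p)))

val blockManagerPort = FallbackEntry("spark.blockManager.port")
val driverBlockManagerPort =
  FallbackEntry("spark.driver.blockManager.port", Some(blockManagerPort))

// The driver-specific key wins when set; otherwise the shared key applies.
val shared = resolve(Map("spark.blockManager.port" -> "7078"), driverBlockManagerPort)
val overridden = resolve(
  Map("spark.blockManager.port" -> "7078", "spark.driver.blockManager.port" -> "7079"),
  driverBlockManagerPort)
```

This also explains why the fallback entry skips the `.intConf`/`.createWithDefault` stages in the diff: its type and default come from the parent entry.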





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578345
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 (quoted diff hunk omitted; it repeats the same `@@ -776,105 +776,128 @@` hunk quoted earlier in this thread, with the quoted context ending at spark.blockManager.port's new `.version("1.1.0")` line)
 
 Review comment (on `spark.blockManager.port`):
   SPARK-2157, commit ID: 31090e43ca91f687b0bc6e25c824dc25bd7027cd#diff-2b643ea78c1add0381754b1f47eec132





[GitHub] [spark] SparkQA commented on issue #27906: [SPARK-31150][SQL] Parsing seconds fraction with variable length for timestamp

2020-03-14 Thread GitBox
SparkQA commented on issue #27906: [SPARK-31150][SQL] Parsing seconds fraction 
with variable length for timestamp
URL: https://github.com/apache/spark/pull/27906#issuecomment-599043918
 
 
   **[Test build #119790 has finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/119790/testReport)** for PR 27906 at commit [`b135dd5`](https://github.com/apache/spark/commit/b135dd50e42446ce277c1c6e04477b3a8ca64427).
   * This patch passes all tests.
   * This patch merges cleanly.
   * This patch adds no public classes.





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578291
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 (quoted diff hunk omitted; it repeats the same `@@ -776,105 +776,128 @@` hunk quoted earlier in this thread, with the quoted context ending at spark.driver.supervise's new `.version("1.3.0")` line)
 
 Review comment (on `spark.driver.supervise`):
   SPARK-5388, commit ID: 6ec0cdc14390d4dc45acf31040f21e1efc476fc0#diff-4d2ab44195558d5a9d5f15b8803ef39d





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578255
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 (quoted diff hunk omitted; it repeats the same `@@ -776,105 +776,128 @@` hunk quoted earlier in this thread, with the quoted context ending at spark.driver.port's new `.version("0.7.0")` line)
 
 Review comment (on `spark.driver.port`):
   No JIRA ID, commit ID: 02a6761589c35f15f1a6e3b63a7964ba057d3ba6#diff-eaf125f56ce786d64dcef99cf446a751





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578248
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 (quoted diff hunk omitted; it repeats the same `@@ -776,105 +776,128 @@` hunk quoted earlier in this thread, with the quoted context ending at spark.driver.host's new `.version("0.7.0")` line)
 
 Review comment (on `spark.driver.host`):
   No JIRA ID, commit ID: 02a6761589c35f15f1a6e3b63a7964ba057d3ba6#diff-eaf125f56ce786d64dcef99cf446a751





[GitHub] [spark] AmplabJenkins removed a comment on issue #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
AmplabJenkins removed a comment on issue #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#issuecomment-599043673
 
 
   Merged build finished. Test PASSed.





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578322
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 (quoted diff hunk omitted; it repeats the same `@@ -776,105 +776,128 @@` hunk quoted earlier in this thread, with the quoted context ending at spark.driver.bindAddress's new `.version("2.1.0")` line)
 
 Review comment (on `spark.driver.bindAddress`):
   SPARK-4563, commit ID: 2cd1bfa4f0c6625b0ab1dbeba2b9586b9a6a9f42#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] AmplabJenkins removed a comment on issue #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
AmplabJenkins removed a comment on issue #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#issuecomment-599043678
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/24527/
   Test PASSed.





[GitHub] [spark] AmplabJenkins commented on issue #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
AmplabJenkins commented on issue #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add 
version information to the configuration of Core
URL: https://github.com/apache/spark/pull/27913#issuecomment-599043678
 
 
   Test PASSed.
   Refer to this link for build results (access rights to CI server needed): 
   
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder-K8s/24527/
   Test PASSed.





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578208
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 (quoted diff hunk omitted; it repeats the same `@@ -776,105 +776,128 @@` hunk quoted earlier in this thread, with the quoted context ending at spark.io.encryption.enabled's new `.version("2.1.0")` line)
 
 Review comment (on `spark.io.encryption.enabled`):
   SPARK-5682, commit ID: 4b4e329e49f8af28fa6301bd06c48d7097eaf9e6#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578217
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 (quoted diff hunk omitted; it repeats the same `@@ -776,105 +776,128 @@` hunk quoted earlier in this thread, with the quoted context ending at spark.io.encryption.keygen.algorithm's new `.version("2.1.0")` line)
 
 Review comment (on `spark.io.encryption.keygen.algorithm`):
   SPARK-5682, commit ID: 4b4e329e49f8af28fa6301bd06c48d7097eaf9e6#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578227
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -776,105 +776,128 @@ package object config {
 
   private[spark] val IO_CRYPTO_CIPHER_TRANSFORMATION =
 ConfigBuilder("spark.io.crypto.cipher.transformation")
   .internal()
+  .version("2.1.0")
 
 Review comment:
   SPARK-5682, commit ID: 
4b4e329e49f8af28fa6301bd06c48d7097eaf9e6#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578223
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -776,105 +776,128 @@ package object config {
 
   private[spark] val IO_ENCRYPTION_KEY_SIZE_BITS = 
ConfigBuilder("spark.io.encryption.keySizeBits")
+.version("2.1.0")
 
 Review comment:
   SPARK-5682, commit ID: 
4b4e329e49f8af28fa6301bd06c48d7097eaf9e6#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] AmplabJenkins commented on issue #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
AmplabJenkins commented on issue #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add 
version information to the configuration of Core
URL: https://github.com/apache/spark/pull/27913#issuecomment-599043673
 
 
   Merged build finished. Test PASSed.





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578138
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -776,105 +776,128 @@ package object config {
 
   private[spark] val PYSPARK_DRIVER_PYTHON = 
ConfigBuilder("spark.pyspark.driver.python")
+.version("2.1.0")
 
 Review comment:
   SPARK-13081, commit ID: 
7a9e25c38380e6c62080d62ad38a4830e44fe753#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578166
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -776,105 +776,128 @@ package object config {
 
   // To limit how many applications are shown in the History Server summary ui
   private[spark] val HISTORY_UI_MAX_APPS =
-
ConfigBuilder("spark.history.ui.maxApplications").intConf.createWithDefault(Integer.MAX_VALUE)
+ConfigBuilder("spark.history.ui.maxApplications")
+  .version("2.0.1")
 
 Review comment:
   SPARK-17243, commit ID: 
021aa28f439443cda1bc7c5e3eee7c85b40c1a2d#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578156
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -776,105 +776,128 @@ package object config {
 
   private[spark] val PYSPARK_PYTHON = ConfigBuilder("spark.pyspark.python")
+.version("2.1.0")
 
 Review comment:
   SPARK-13081, commit ID: 
7a9e25c38380e6c62080d62ad38a4830e44fe753#diff-6bdad48cfc34314e89599655442ff210





[GitHub] [spark] SparkQA commented on issue #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
SparkQA commented on issue #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add 
version information to the configuration of Core
URL: https://github.com/apache/spark/pull/27913#issuecomment-599043553
 
 
   **[Test build #119797 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/119797/testReport)**
 for PR 27913 at commit 
[`ad58e0c`](https://github.com/apache/spark/commit/ad58e0cc8934f19eefc441f7977b840dfe850c0d).





[GitHub] [spark] beliefer commented on a change in pull request #27913: [SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration of Core

2020-03-14 Thread GitBox
beliefer commented on a change in pull request #27913: 
[SPARK-31002][CORE][DOC][FOLLOWUP] Add version information to the configuration 
of Core
URL: https://github.com/apache/spark/pull/27913#discussion_r392578077
 
 

 ##
 File path: core/src/main/scala/org/apache/spark/internal/config/package.scala
 ##
 @@ -776,105 +776,128 @@ package object config {
 
   private[spark] val METRICS_EXECUTORMETRICS_SOURCE_ENABLED =
 ConfigBuilder("spark.metrics.executorMetricsSource.enabled")
   .doc("Whether to register the ExecutorMetrics source with the metrics 
system.")
+  .version("3.0.0")
 
 Review comment:
   SPARK-27189, commit ID: 
729f43f499f3dd2718c0b28d73f2ca29cc811eac#diff-6bdad48cfc34314e89599655442ff210




