[GitHub] spark pull request #20269: [SPARK-23029] [DOCS] Specifying default units of ...
Github user Ngone51 commented on a diff in the pull request: https://github.com/apache/spark/pull/20269#discussion_r238135789

Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala

```diff
@@ -38,10 +38,13 @@ package object config {
     ConfigBuilder("spark.driver.userClassPathFirst").booleanConf.createWithDefault(false)

   private[spark] val DRIVER_MEMORY = ConfigBuilder("spark.driver.memory")
+    .doc("Amount of memory to use for the driver process, in MiB unless otherwise specified.")
     .bytesConf(ByteUnit.MiB)
     .createWithDefaultString("1g")

   private[spark] val DRIVER_MEMORY_OVERHEAD = ConfigBuilder("spark.driver.memoryOverhead")
+    .doc("The amount of off-heap memory to be allocated per driver in cluster mode, " +
```

Hi, @ferdonline, can you explain why this is **off-heap** memory?

---
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
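For context on the behavior being documented, here is a small illustrative sketch of how a size property whose default unit is MiB (as `bytesConf(ByteUnit.MiB)` declares for `spark.driver.memory`) interprets its value. This is a toy Python model for explanation only, not Spark's actual parser, and `parse_mib` is a hypothetical helper name: the point is that a suffixed value like `"1g"` and a bare number like `"1024"` resolve to the same amount, because the bare number is taken as MiB rather than bytes.

```python
def parse_mib(value: str) -> float:
    """Toy model: return a size string as MiB, assuming MiB for bare numbers."""
    multipliers = {"b": 1 / (1024 * 1024), "k": 1 / 1024, "m": 1.0,
                   "g": 1024.0, "t": 1024.0 * 1024}
    v = value.strip().lower()
    # Accept the two-letter forms too: "1gb" -> "1g" (a plain "1b" is left alone)
    if v.endswith("b") and len(v) > 1 and v[-2] in multipliers:
        v = v[:-1]
    if v and not v[-1].isdigit():
        return float(v[:-1]) * multipliers[v[-1]]
    return float(v)  # unit-less: interpreted as MiB here, NOT as bytes

parse_mib("1g")    # 1024.0 -- the "1g" default equals 1024 MiB
parse_mib("1024")  # 1024.0 -- a bare number means MiB for this property
```

This is exactly the pitfall the PR documents: a user writing a bare `1073741824` hoping for "1 GiB in bytes" would instead configure roughly a pebibyte of driver memory.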
Github user asfgit closed the pull request at: https://github.com/apache/spark/pull/20269
Github user srowen commented on a diff in the pull request: https://github.com/apache/spark/pull/20269#discussion_r161608661

Diff: core/src/main/scala/org/apache/spark/internal/config/package.scala

```diff
@@ -419,7 +419,7 @@ package object config {
   private[spark] val SHUFFLE_FILE_BUFFER_SIZE = ConfigBuilder("spark.shuffle.file.buffer")
-    .doc("Size of the in-memory buffer for each shuffle file output stream. " +
+    .doc("Size (in KiB) of the in-memory buffer for each shuffle file output stream. " +
```

Really, "in KiB unless otherwise specified"? Same for the next property below. These two are the only two that aren't in bytes by default and that already have a description. For consistency, it would also be handy to add a blurb about this to all of the "MiB" default properties above this.
Github user srowen commented on a diff in the pull request: https://github.com/apache/spark/pull/20269#discussion_r161609423

Diff: docs/configuration.md

```diff
@@ -150,6 +152,7 @@ of the most common options to set are:
   Amount of memory to use for the driver process, i.e. where SparkContext is initialized.
   (e.g. 1g, 2g).
+  Default unit: MiB
```

Everywhere the default isn't bytes, a clause like ", in MiB unless otherwise specified" seems cleanest. There are 9 such properties as far as I can tell. Although it would be complete to say "in bytes" for all other properties, that's probably not necessary.
Github user srowen commented on a diff in the pull request: https://github.com/apache/spark/pull/20269#discussion_r161609127

Diff: docs/configuration.md

```diff
@@ -58,6 +58,8 @@ The following format is accepted:
     1t or 1tb (tebibytes = 1024 gibibytes)
     1p or 1pb (pebibytes = 1024 tebibytes)
+
+Without specification the unit depends on the configuration entry where KiB are typically assumed.
```

Just looking at the properties that use bytesConf(), there are as many in MiB. And really, the default is just bytes unless otherwise specified. If you say anything here, maybe just: "While numbers without units are generally interpreted as bytes, a few are interpreted as KiB or MiB when no units are specified, for historical reasons. See documentation of individual configuration properties. Specifying units is desirable where possible."
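The point in the suggested wording above — a bare number is usually bytes, but a few properties assume KiB or MiB — can be illustrated with a small sketch. This is an explanatory toy model, not Spark's actual parser; `to_bytes` is a hypothetical helper, and its `default_unit` argument stands in for the per-property default that Spark's code selects via `bytesConf(ByteUnit....)`.

```python
def to_bytes(value: str, default_unit: str = "b") -> int:
    """Toy model: resolve a size string to bytes, using default_unit for bare numbers."""
    units = {"b": 1, "k": 1024, "m": 1024**2, "g": 1024**3,
             "t": 1024**4, "p": 1024**5}
    v = value.strip().lower()
    # Accept two-letter suffixes: "1gb" -> "1g" (a plain "1b" is left alone)
    if v.endswith("b") and len(v) > 1 and v[-2] in units and v[-2] != "b":
        v = v[:-1]
    if v and not v[-1].isdigit():
        return int(v[:-1]) * units[v[-1]]
    return int(v) * units[default_unit]  # bare number: the property's default unit

to_bytes("64", default_unit="k")  # 65536     -- KiB-default property, e.g. spark.shuffle.file.buffer
to_bytes("64", default_unit="m")  # 67108864  -- MiB-default property, e.g. spark.driver.memory
to_bytes("64k")                   # 65536     -- explicit unit: same result for any property
```

The same bare `64` means something 1024 times larger for a MiB-default property than for a KiB-default one, which is why explicitly writing the unit is the safe choice.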
GitHub user ferdonline opened a pull request: https://github.com/apache/spark/pull/20269

[SPARK-23029] [DOCS] Specifying default units of configuration entries

## What changes were proposed in this pull request?

This PR completes the docs by specifying the default units assumed in configuration entries of type size. This is crucial, since unit-less values are accepted and the user might assume the base unit is bytes, which in most cases it is not, leading to hard-to-debug problems.

## How was this patch tested?

This patch updates documentation only.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/ferdonline/spark docs_units

Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/20269.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #20269

commit 889426b191f1f542012e0a5c9f0a121f64a2b46e
Author: Fernando Pereira
Date: 2018-01-15T13:56:23Z

    Specifying default units of configuration entries