Github user liyinan926 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19717#discussion_r155908117

    --- Diff: docs/configuration.md ---
    @@ -157,13 +157,31 @@ of the most common options to set are:
         or in your default properties file.
       </td>
     </tr>
    +<tr>
    +  <td><code>spark.driver.memoryOverhead</code></td>
    +  <td>driverMemory * 0.10, with minimum of 384</td>
    +  <td>
    +    The amount of off-heap memory (in megabytes) to be allocated per driver in cluster mode. This is
    +    memory that accounts for things like VM overheads, interned strings, other native overheads, etc.
    +    This tends to grow with the container size (typically 6-10%).
    --- End diff --

    Done.
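For context, a minimal sketch of how the documented setting might be supplied at submit time in cluster mode. The memory values, class name, and jar path below are illustrative placeholders, not part of the documented change:

    ./bin/spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --class com.example.MyApp \
      --conf spark.driver.memory=4g \
      --conf spark.driver.memoryOverhead=512 \
      /path/to/my-app.jar

    # Illustrative values only: 4g of driver heap plus an explicit 512 MiB of
    # off-heap overhead, overriding the documented default of
    # max(driverMemory * 0.10, 384). The same keys can also go in
    # spark-defaults.conf, per the surrounding docs text.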