Attila Zsolt Piros created SPARK-32293:
------------------------------------------

             Summary: Inconsistent default units for configuring Spark memory
                 Key: SPARK-32293
                 URL: https://issues.apache.org/jira/browse/SPARK-32293
             Project: Spark
          Issue Type: Bug
          Components: Documentation, Spark Core
    Affects Versions: 3.0.0, 2.4.6, 2.4.5, 2.4.4, 2.4.3, 2.4.2, 2.4.1, 2.4.0, 2.3.4, 2.3.3, 2.3.2, 2.3.1, 2.3.0, 2.2.3, 2.2.2, 2.2.1, 3.0.1, 3.1.0
            Reporter: Attila Zsolt Piros


Spark's maximum memory can be configured in several ways (a sketch of each follows the list):
- via a Spark config
- via a command-line argument
- via environment variables
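
For illustration, a minimal sketch of the three styles for the executor memory (the driver has the analogous spark.driver.memory, --driver-memory and SPARK_DRIVER_MEMORY counterparts; the value 4g is an arbitrary example):

{noformat}
# 1. Spark config, e.g. on the command line or in spark-defaults.conf
spark-submit --conf spark.executor.memory=4g ...

# 2. Dedicated command-line argument of spark-submit
spark-submit --executor-memory 4g ...

# 3. Environment variable, e.g. in conf/spark-env.sh
export SPARK_EXECUTOR_MEMORY=4g
{noformat}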

The memory can be configured separately for the executors and for the driver. 
All of these settings follow the format of the JVM memory options in that they 
accept the same size unit suffixes ("k", "m", "g" or "t"), but there is an 
inconsistency regarding the default unit. When no suffix is given, the amount 
is passed as-is to the JVM (to the -Xmx and -Xms options), and those options 
use bytes as the default unit; see the example 
[here|https://docs.oracle.com/javase/8/docs/technotes/tools/windows/java.html]:

{noformat}
The following examples show how to set the maximum allowed size of allocated 
memory to 80 MB using various units:

-Xmx83886080 
-Xmx81920k 
-Xmx80m
{noformat}

The Spark memory configs, on the other hand, use "m" as the default unit when no suffix is given.
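
To make the inconsistency concrete (2048 is an arbitrary example value):

{noformat}
# Spark interprets the suffix-less value as megabytes -> 2048 MB
spark-submit --conf spark.executor.memory=2048 ...

# the very same number given directly to the JVM means 2048 bytes
# (a value the JVM would reject as a too-small heap)
java -Xmx2048 ...
{noformat}

So the same numeric value means a different amount of memory depending on which layer parses it.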


