[ https://issues.apache.org/jira/browse/SPARK-14367?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15235623#comment-15235623 ]

Tom Hubregtsen commented on SPARK-14367:
----------------------------------------

Sorry, let me be more precise. 

I can now reproduce the default memory behavior of Spark 1.3 (just calling 
spark-submit with no parameters) with explicit settings in Spark 1.6 (using 
useLegacyMode together with setting the driver memory to 512m).
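
For reference, a minimal sketch of the equivalent settings (assuming cluster 
deploy mode, where spark.driver.memory set on the conf is honored; in client 
mode the driver JVM is already running, so you would pass --driver-memory 512m 
to spark-submit instead):

  import org.apache.spark.{SparkConf, SparkContext}

  // Reproduce the Spark 1.3 defaults on Spark 1.6: fall back to the legacy
  // (pre-1.6) memory manager and shrink the driver heap back to 512m.
  val conf = new SparkConf()
    .setAppName("legacy-memory-demo")
    .set("spark.memory.useLegacyMode", "true")
    .set("spark.driver.memory", "512m") // default was 512m up to 1.4, 1g from 1.5
  val sc = new SparkContext(conf)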

My follow-up question was about the default values for the previously 
mentioned settings. I found the following using [1] and [2]:
- spark.[shuffle/storage].memoryFraction and spark.storage.unrollFraction 
  kept the same default values from Spark 1.3 to Spark 1.6
- the default --driver-memory doubled (512m to 1g) from Spark 1.4 to 1.5

With this in mind, useLegacyMode works as expected: the defaults for Spark 
1.6 with useLegacyMode are the same as for Spark 1.5. I just did not realize 
they had changed between 1.3 and 1.5.
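
As a sanity check, the two MemoryStore capacities from the issue fall out of 
the legacy formula (heap size x spark.storage.memoryFraction x 
spark.storage.safetyFraction, where safetyFraction defaults to 0.9) once the 
doubled default driver memory is taken into account. A rough sketch (the 
JVM's Runtime.maxMemory is actually slightly below -Xmx, so the real numbers 
can come out marginally lower):

  // Legacy (StaticMemoryManager) storage capacity:
  //   heap * spark.storage.memoryFraction * spark.storage.safetyFraction
  def storageCapacityMiB(heapMiB: Double,
                         memoryFraction: Double,
                         safetyFraction: Double = 0.9): Double =
    heapMiB * memoryFraction * safetyFraction

  // Spark 1.3 default driver memory (512m), fraction 0.1 -> ~46.1 MB
  println(f"1.3: ${storageCapacityMiB(512, 0.1)}%.1f MB")  // 46.1
  // Spark 1.6 default driver memory (1g), fraction 0.1 -> ~92.2 MB
  println(f"1.6: ${storageCapacityMiB(1024, 0.1)}%.1f MB") // 92.2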

Sources: 
[1] http://spark.apache.org/docs/1.3.0/configuration.html
[2] http://spark.apache.org/docs/1.6.1/configuration.html

> spark.memory.useLegacyMode=true in 1.6 does not yield the same memory 
> behavior as Spark 1.3
> -------------------------------------------------------------------------------------------
>
>                 Key: SPARK-14367
>                 URL: https://issues.apache.org/jira/browse/SPARK-14367
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.6.0
>         Environment: Ubuntu 15.10 with ibm-java-ppc64le-80. 
>            Reporter: Tom Hubregtsen
>            Priority: Minor
>              Labels: backwards-compatibility
>
>  Hi,
> I am trying to get the same memory behavior in Spark 1.6 as I had in Spark 
> 1.3 with default settings.
> I set
> --driver-java-options "-Dspark.memory.useLegacyMode=true 
> -Dspark.shuffle.memoryFraction=0.2 -Dspark.storage.memoryFraction=0.6 
> -Dspark.storage.unrollFraction=0.2"
> in Spark 1.6.
> But the numbers don't add up. For instance:
> --driver-java-options "-Dspark.shuffle.memoryFraction=0.1 
> -Dspark.storage.memoryFraction=0.1"
> in Spark 1.3.1 leads to:
> 16/03/29 14:47:36 INFO MemoryStore: MemoryStore started with capacity 46.1 MB
> The same in Spark 1.6.0 with -Dspark.memory.useLegacyMode=true 
> -Dspark.shuffle.memoryFraction=0.1 -Dspark.storage.memoryFraction=0.1 leads to:
> 16/03/29 14:50:55 INFO MemoryStore: MemoryStore started with capacity 92.2 MB
> If I then increase both fractions to 0.2, the MemoryStore capacities both 
> double (as one would expect), but that means there is still a 2x difference 
> in allocated memory between Spark 1.3 and Spark 1.6. So my point:
> I believe a parameter that reads
> spark.memory.useLegacyMode=true
> should yield the *exact* same memory behavior as the legacy version. 


