[ https://issues.apache.org/jira/browse/SPARK-18503?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15679064#comment-15679064 ]

Sean Owen commented on SPARK-18503:
-----------------------------------

Yes, I don't think that's a problem per se. The various memory properties now default to interpreting this value as megabytes, and in any event I don't think we'd change this in a minor release, since it would be a behavior change. It's best to specify your units anyway, for clarity. CC [~jerryshao] given https://github.com/apache/spark/pull/11603/files#diff-6bdad48cfc34314e89599655442ff210R38

> Pre 2.0 spark driver/executor memory default unit is bytes, post 2.0 default unit is MB
> ---------------------------------------------------------------------------------------
>
>                 Key: SPARK-18503
>                 URL: https://issues.apache.org/jira/browse/SPARK-18503
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>    Affects Versions: 2.0.0, 2.0.1, 2.0.2
>            Reporter: Chris McCubbin
>            Priority: Minor
>
> Prior to v2.0, Spark's default unit for executor memory was bytes: if one set "spark.executor.memory" to "1000000" with no unit suffix (like "1m"), it was interpreted as 1,000,000 bytes, consistent with the JVM. Changes introduced by SPARK-12343 in 2.0 made the default unit MB, so "1000000" is now interpreted as 1,000,000 MB.
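
To illustrate the advice above, here is a minimal sketch (not from the ticket itself) of setting executor memory with an explicit unit suffix, so the value is unambiguous on both sides of the 2.0 boundary. The object name, app name, and local master are hypothetical:

{code:scala}
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical sketch: give memory settings an explicit unit suffix so the
// value cannot be read as bytes on pre-2.0 Spark or as megabytes on 2.0+.
object ExplicitMemoryUnits {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("local[2]")                 // hypothetical master, for a runnable example
      .setAppName("explicit-memory-units")   // hypothetical app name
      .set("spark.executor.memory", "1g")    // unambiguous: 1 GiB in any Spark version
    val sc = new SparkContext(conf)
    try {
      // Trivial job just to show the context starts with the explicit setting.
      println(sc.parallelize(1 to 10).sum())
    } finally {
      sc.stop()
    }
  }
}
{code}

The same advice applies at launch time, where driver memory must be fixed before the driver JVM starts, e.g. spark-submit --driver-memory 2g --executor-memory 1g.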