[ https://issues.apache.org/jira/browse/SPARK-5073?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14264554#comment-14264554 ]
Apache Spark commented on SPARK-5073:
-------------------------------------

User 'Lewuathe' has created a pull request for this issue:
https://github.com/apache/spark/pull/3900

> "spark.storage.memoryMapThreshold" has two default values
> ----------------------------------------------------------
>
>                 Key: SPARK-5073
>                 URL: https://issues.apache.org/jira/browse/SPARK-5073
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.0
>            Reporter: Yuan Jianhui
>            Priority: Minor
>
> In org.apache.spark.storage.DiskStore:
>   val minMemoryMapBytes =
>     blockManager.conf.getLong("spark.storage.memoryMapThreshold", 2 * 4096L)
>
> In org.apache.spark.network.util.TransportConf:
>   public int memoryMapBytes() {
>     return conf.getInt("spark.storage.memoryMapThreshold", 2 * 1024 * 1024);
>   }

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
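The two snippets quoted in the issue hard-code different fall-back values for the same configuration key: DiskStore falls back to 2 * 4096 bytes (8 KiB) while TransportConf falls back to 2 * 1024 * 1024 bytes (2 MiB). A minimal sketch comparing the two defaults (class name is illustrative, not from the Spark codebase):

```java
public class MemoryMapThresholdDefaults {
    // Default used in org.apache.spark.storage.DiskStore (Scala): 2 * 4096L
    static final long DISK_STORE_DEFAULT = 2 * 4096L;          // 8192 bytes = 8 KiB

    // Default used in org.apache.spark.network.util.TransportConf (Java): 2 * 1024 * 1024
    static final int TRANSPORT_CONF_DEFAULT = 2 * 1024 * 1024; // 2097152 bytes = 2 MiB

    public static void main(String[] args) {
        System.out.println("DiskStore default:     " + DISK_STORE_DEFAULT + " bytes");
        System.out.println("TransportConf default: " + TRANSPORT_CONF_DEFAULT + " bytes");
        // The mismatch: whichever code path reads the key when it is unset
        // memory-maps blocks at a different threshold (8 KiB vs 2 MiB).
        System.out.println("Defaults agree: "
                + (DISK_STORE_DEFAULT == (long) TRANSPORT_CONF_DEFAULT));
    }
}
```

Setting "spark.storage.memoryMapThreshold" explicitly masks the discrepancy, since both call sites then read the same user-supplied value; only the implicit defaults disagree.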