Yuan Jianhui created SPARK-5073:
-----------------------------------

             Summary: "spark.storage.memoryMapThreshold" has two default values
                 Key: SPARK-5073
                 URL: https://issues.apache.org/jira/browse/SPARK-5073
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.2.0
            Reporter: Yuan Jianhui
            Priority: Minor
The configuration key "spark.storage.memoryMapThreshold" is read in two places with different hard-coded defaults: 8 KB in DiskStore versus 2 MB in TransportConf. If the key is not set explicitly, the two code paths use different thresholds.

In org.apache.spark.storage.DiskStore:

    val minMemoryMapBytes = blockManager.conf.getLong("spark.storage.memoryMapThreshold", 2 * 4096L)

In org.apache.spark.network.util.TransportConf:

    public int memoryMapBytes() {
      return conf.getInt("spark.storage.memoryMapThreshold", 2 * 1024 * 1024);
    }

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
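The mismatch can be illustrated with a minimal standalone sketch (the class name and variable names below are illustrative, not from the Spark codebase); it just evaluates the two default expressions quoted above:

```java
public class MemoryMapThresholdDefaults {
    public static void main(String[] args) {
        // Default used in org.apache.spark.storage.DiskStore (Scala source):
        long diskStoreDefault = 2 * 4096L;          // 8192 bytes = 8 KB

        // Default used in org.apache.spark.network.util.TransportConf (Java source):
        int transportConfDefault = 2 * 1024 * 1024; // 2097152 bytes = 2 MB

        // The two fallbacks differ by a factor of 256, so when the key is
        // unset, memory-mapping kicks in at different sizes in each path.
        System.out.println(diskStoreDefault);       // prints 8192
        System.out.println(transportConfDefault);   // prints 2097152
    }
}
```

Unifying the two defaults (or having one class read the other's value) would make the fallback behavior consistent.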