[ https://issues.apache.org/jira/browse/SPARK-5073?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16131667#comment-16131667 ]
zhaP524 commented on SPARK-5073:
--------------------------------

I wonder what this parameter is for? Also, is it related to memory-mapped files? I hope you can explain; sorry for the trouble.

> "spark.storage.memoryMapThreshold" has two default values
> ----------------------------------------------------------
>
>                 Key: SPARK-5073
>                 URL: https://issues.apache.org/jira/browse/SPARK-5073
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.0
>            Reporter: Jianhui Yuan
>            Priority: Minor
>
> In org.apache.spark.storage.DiskStore:
>     val minMemoryMapBytes =
>       blockManager.conf.getLong("spark.storage.memoryMapThreshold", 2 * 4096L)
> In org.apache.spark.network.util.TransportConf:
>     public int memoryMapBytes() {
>       return conf.getInt("spark.storage.memoryMapThreshold", 2 * 1024 * 1024);
>     }
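To illustrate what the DiskStore-side setting controls, here is a minimal, self-contained Scala sketch of the kind of decision it drives: segments smaller than the threshold are copied into a regular heap buffer, while larger ones are memory-mapped. The object and method names (MemoryMapThresholdSketch, readSegment) and the hard-coded fallback value are illustrative assumptions, not the actual Spark source.

    import java.io.{File, IOException, RandomAccessFile}
    import java.nio.ByteBuffer
    import java.nio.channels.FileChannel.MapMode

    object MemoryMapThresholdSketch {

      // Illustrative fallback: DiskStore in 1.2.0 uses 2 * 4096 bytes when
      // spark.storage.memoryMapThreshold is not set.
      val minMemoryMapBytes: Long = 2 * 4096L

      // Read `length` bytes starting at `offset` from `file`. Segments below the
      // threshold are read into a plain heap buffer; larger segments are served
      // through a read-only memory-mapped region.
      def readSegment(file: File, offset: Long, length: Long): ByteBuffer = {
        val channel = new RandomAccessFile(file, "r").getChannel
        try {
          if (length < minMemoryMapBytes) {
            // Small segment: a direct read avoids the fixed cost of setting up
            // an mmap, which is not worthwhile for a few kilobytes.
            val buf = ByteBuffer.allocate(length.toInt)
            channel.position(offset)
            while (buf.remaining() > 0) {
              if (channel.read(buf) == -1) {
                throw new IOException(s"Unexpected EOF in $file at offset $offset")
              }
            }
            buf.flip()
            buf
          } else {
            // Large segment: let the OS page the data in lazily instead of
            // copying the whole segment onto the JVM heap.
            channel.map(MapMode.READ_ONLY, offset, length)
          }
        } finally {
          channel.close()
        }
      }
    }

Memory-mapping has a roughly constant setup cost on the order of a few OS pages, so mapping blocks close to or below the page size is not worthwhile; that is what the threshold guards against. Because the two code paths in the description fall back to different hard-coded defaults (8 KB vs. 2 MB), setting spark.storage.memoryMapThreshold explicitly in the application's SparkConf is the simplest way to get one consistent value.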