Pat McDonough created SPARK-1392:
------------------------------------

             Summary: Local spark-shell Runs Out of Memory With Default Settings
                 Key: SPARK-1392
                 URL: https://issues.apache.org/jira/browse/SPARK-1392
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 0.9.0
         Environment: OS X 10.9.2, Java 1.7.0_51, Scala 2.10.3
            Reporter: Pat McDonough
When running the spark-shell locally with the out-of-the-box configuration and attempting to cache the attached data, Spark OOMs with {{java.lang.OutOfMemoryError: GC overhead limit exceeded}}:

{code}
val explore = sc.textFile("/Users/pat/Projects/training-materials/Data/wiki_links")
explore.cache
explore.count
{code}

You can work around the issue by either decreasing {{spark.storage.memoryFraction}} or increasing {{SPARK_MEM}}.
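As a rough sketch of the {{spark.storage.memoryFraction}} workaround (not taken from the ticket itself; the master string, app name, and use of a standalone driver here are illustrative assumptions), a program can lower the cache fraction through {{SparkConf}} before the {{SparkContext}} is created. For the shell itself, the equivalent would be passing the property through the environment (e.g. {{SPARK_JAVA_OPTS}}) or raising {{SPARK_MEM}} as noted above.

{code}
// Sketch only: lowering the storage (cache) fraction for a Spark 0.9-era driver.
// The master, app name, and lowered value (0.4) are illustrative assumptions.
import org.apache.spark.{SparkConf, SparkContext}

object CacheWikiLinks {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("cache-wiki-links")
      // Lower than the default so less of the heap is reserved for cached
      // blocks, leaving more room for execution and GC headroom.
      .set("spark.storage.memoryFraction", "0.4")

    val sc = new SparkContext(conf)

    val explore = sc.textFile("/Users/pat/Projects/training-materials/Data/wiki_links")
    explore.cache()
    println(explore.count())

    sc.stop()
  }
}
{code}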