Re: Memory allocation error with Spark 1.5, HashJoinCompatibilitySuite

2015-08-24 Thread Adam Roberts
Hi, I'm regularly hitting "Unable to acquire memory" problems, but only when overflow pages are needed, while running the full set of Spark tests across different platforms. The machines I'm using all have well over 10 GB of RAM, and I'm running without any changes to the pom.xml file. Standard 3 GB Jav…

Re: Memory allocation error with Spark 1.5

2015-08-06 Thread Alexis Seigneurin
Works like a charm. Thanks Reynold for the quick and efficient response! Alexis 2015-08-05 19:19 GMT+02:00 Reynold Xin: > In Spark 1.5, we have a new way to manage memory (part of Project > Tungsten). The default unit of memory allocation is 64MB, which is way too > high when you have 1G of mem…

Re: Memory allocation error with Spark 1.5

2015-08-05 Thread Reynold Xin
In Spark 1.5, we have a new way to manage memory (part of Project Tungsten). The default unit of memory allocation is 64MB, which is way too high when you have 1G of memory allocated in total and have more than 4 threads. We will reduce the default page size before releasing 1.5. For now, you can …
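The rest of Reynold's workaround is truncated in this archive snippet, but the knob he alludes to, Tungsten's memory page size, can be sketched. Note the configuration key name `spark.buffer.pageSize` is an assumption for Spark 1.5-era builds; verify it against your Spark version's configuration reference before relying on it:

```shell
# Hedged sketch: lower Tungsten's per-task memory page size so that a small
# executor (e.g. a 1 GB heap with more than 4 task threads) can still acquire
# pages instead of failing with "Unable to acquire ... bytes of memory".
# ASSUMPTIONS: the key name spark.buffer.pageSize and the jar name my-app.jar
# are illustrative, not confirmed by this thread.
spark-submit \
  --conf spark.buffer.pageSize=16m \
  --driver-memory 1g \
  my-app.jar
```

The trade-off is granularity versus throughput: smaller pages let more concurrent tasks fit within a fixed memory budget, at the cost of more frequent page allocations.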

Memory allocation error with Spark 1.5

2015-08-05 Thread Alexis Seigneurin
Hi, I'm receiving a memory allocation error with a recent build of Spark 1.5:

java.io.IOException: Unable to acquire 67108864 bytes of memory
    at org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.acquireNewPageIfNecessary(UnsafeExternalSorter.java:348)
    at org.apache.spark.util.coll…
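The 67108864 bytes in the exception is exactly the 64MB default page size that Reynold's reply above attributes to Project Tungsten, a quick arithmetic check:

```shell
# 64 MB expressed in bytes matches the single-page allocation the
# UnsafeExternalSorter failed to acquire in the stack trace above.
echo $((64 * 1024 * 1024))   # prints 67108864
```

So the sorter is failing on its very first page request, consistent with the diagnosis that the default page size is too large for the available memory.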