Works like a charm. Thanks Reynold for the quick and efficient response!
Alexis
2015-08-05 19:19 GMT+02:00 Reynold Xin r...@databricks.com:
Hi,
I'm receiving a memory allocation error with a recent build of Spark 1.5:
java.io.IOException: Unable to acquire 67108864 bytes of memory
        at org.apache.spark.util.collection.unsafe.sort.UnsafeExternalSorter.acquireNewPageIfNecessary(UnsafeExternalSorter.java:348)
        at
In Spark 1.5, we have a new way to manage memory (part of Project
Tungsten). The default unit of memory allocation is 64MB, which is way too
high when you have 1G of memory allocated in total and have more than 4
threads.
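The arithmetic behind the failure can be sketched as follows. This is a minimal illustration only: the 0.2 shuffle-memory fraction and the even per-task split are assumptions made for the sake of the example, not Spark 1.5's exact accounting.

```python
# Why a 64 MB page may not be acquirable with a 1 GB heap and several
# concurrent tasks. The shuffle fraction and per-task split below are
# illustrative assumptions, not Spark's precise memory-manager logic.

HEAP = 1024 * 1024 * 1024          # 1 GB of executor memory
SHUFFLE_FRACTION = 0.2             # assumed fraction available for shuffle
PAGE_SIZE = 64 * 1024 * 1024       # 64 MB default page (67108864 bytes)

def can_acquire_page(num_tasks: int) -> bool:
    """True if each task's share of shuffle memory fits one 64 MB page."""
    per_task = HEAP * SHUFFLE_FRACTION / num_tasks
    return per_task >= PAGE_SIZE

print(can_acquire_page(1))   # one task: ~204 MB share, page fits
print(can_acquire_page(4))   # four tasks: ~51 MB share, too small for 64 MB
```

With these assumed numbers, a single task has headroom, but once the shuffle pool is split across a few threads each share drops below 67108864 bytes and the allocation fails, matching the IOException above.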
We will reduce the default page size before releasing 1.5. For now, you
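The rest of that sentence is cut off in the archive, but a workaround along those lines would be to lower the page size per job. The property name `spark.buffer.pageSize` and the `16m` value below are assumptions about pre-release 1.5 builds, not something confirmed by this thread:

```shell
# Hypothetical sketch: pass a smaller page size when submitting the job.
# The property name and value are assumptions, not confirmed by this thread.
spark-submit \
  --conf spark.buffer.pageSize=16m \
  --executor-memory 1g \
  my_job.jar
```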