>>> Caused by:
>>> java.lang.OutOfMemoryError: Unable to acquire 28 bytes of memory, got 0
>>>
>>> Hope this helps
>>>
>>> Andy
>>>
>>> From: Michael Armbrust (mailto:mich...@databricks.com)
>>> Subject: Re: trouble understanding data frame memory usage
>>> "java.io.IOException: Unable to acquire memory"

coalesce() is an optimized version of repartition() and should be
used whenever we know we are reducing the number of partitions.
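
To make the distinction concrete, here is a minimal sketch against the
Spark 1.5-era Java API; the class name and the partition counts are
illustrative, not from the thread:

import org.apache.spark.sql.DataFrame;

class PartitionSketch {
    // coalesce() narrows the partition count without a full shuffle,
    // which is why it is preferred when reducing partitions.
    static DataFrame shrink(DataFrame df) {
        return df.coalesce(8);
    }

    // repartition() always performs a full shuffle; use it when
    // increasing the partition count or rebalancing skewed data.
    static DataFrame rebalance(DataFrame df) {
        return df.repartition(200);
    }
}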
Kind regards
Andy
From: Michael Armbrust
Date: Monday, December 28, 2015 at 2:41 PM
To: Andrew Davidson
Cc: "user @spark"
Subject: Re: trouble understanding data frame memory usage
"java.io.IOException: Unable to acquire memory"

The 200 number looks strangely similar to the following default number of
post-shuffle partitions, which is often left untuned:

spark.sql.shuffle.partitions

Here's the property defined in the Spark source:
https://github.com/apache/spark/blob/834e71489bf560302f9d743dff669df1134e9b74/sql/core/sr
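
If those 200 post-shuffle partitions are indeed the problem, the property
can be lowered for a small local test. A minimal sketch, assuming a
Spark 1.5-era SQLContext named sqlContext (the name and the value 4 are
illustrative):

// 200 is the default; unit-test-sized data rarely needs more than a few.
sqlContext.setConf("spark.sql.shuffle.partitions", "4");
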
Unfortunately in 1.5 we didn't force operators to spill when they ran out of
memory, so there is not a lot you can do. It would be awesome if you could
test with 1.6 and see if things are any better?

On Mon, Dec 28, 2015 at 2:25 PM, Andy Davidson <
a...@santacruzintegration.com> wrote:
> I am using spark 1.5.1. I am running into some memory problems with a java
> unit test. Yes, I could fix it by setting Xmx (it's set to 1024M); however, I
> want to better understand what is going on so I can write better code in the
> future. The test runs on a Mac, master="local[2]".
>
> I have a java unit t
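
For context, a minimal sketch of the kind of local-mode test setup described
above, using the Spark 1.5-era Java API (the class name and app name are
illustrative):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SQLContext;

class LocalSparkTestSketch {
    public static void main(String[] args) {
        // Two local worker threads, matching master="local[2]"; the driver
        // heap is whatever -Xmx the test JVM was started with (1024M here).
        SparkConf conf = new SparkConf()
                .setAppName("DataFrameMemoryTest")
                .setMaster("local[2]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        SQLContext sqlContext = new SQLContext(sc);

        // ... build DataFrames and exercise the code under test ...

        sc.stop();
    }
}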