You may have already looked at MemoryManager#pageSizeBytes, where the
"spark.buffer.pageSize" config can override the default page size.
FYI
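
For reference, a minimal sketch of setting it explicitly. "spark.buffer.pageSize"
is an internal, undocumented setting, and the values below (4m page size, 400
shuffle partitions) are only illustrative assumptions, not recommendations:

import org.apache.spark.{SparkConf, SparkContext}

object PageSizeExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("page-size-example")
      // Overrides the default computed in MemoryManager#pageSizeBytes.
      // The value is parsed as a size string, so suffixes like "4m" should work.
      .set("spark.buffer.pageSize", "4m")
      // More, smaller shuffle partitions -> smaller per-task memory footprint.
      .set("spark.sql.shuffle.partitions", "400")

    val sc = new SparkContext(conf)
    // ... run the job that was hitting the OOM ...
    sc.stop()
  }
}

The same settings can also be passed at submit time, e.g.
spark-submit --conf spark.buffer.pageSize=4m --conf spark.sql.shuffle.partitions=400 ...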
On Mon, Mar 28, 2016 at 12:07 PM, Steve Johnston <
sjohns...@algebraixdata.com> wrote:
> I'm attempting to address an OOM issue. In the thread
> "java.lang.OutOfMemoryError: Unable to acquire bytes of memory"
> <
> http://apache-spark-developers-list.1001551.n3.nabble.com/java-lang-OutOfMemoryError-Unable-to-acquire-bytes-of-memory-td16773.html
> >
> I saw a reference to the configuration setting "spark.buffer.pageSize",
> which was used in conjunction with "spark.sql.shuffle.partitions" to solve
> the OOM problem Nezih was having.
>
> What is "spark.buffer.pageSize"? How can it be used? I can find it in the
> code, but there doesn't seem to be any other documentation.
>
> Thanks,
> Steve
>
>
>