Hi,
I’m using Spark 1.1.0 and I’m having some issues setting up memory options.
I get “Requested array size exceeds VM limit” and I’m probably missing
something in my memory configuration
(https://spark.apache.org/docs/1.1.0/configuration.html).
My server has 30G of memory and these are my
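For reference, a minimal sketch of how such options can be set in 1.1.0; the sizes and app name below are purely illustrative, not my actual settings:

  import org.apache.spark.{SparkConf, SparkContext}

  // Illustrative values only; tune to the 30G available on the box.
  val conf = new SparkConf()
    .setAppName("MemoryConfigSketch")
    .set("spark.executor.memory", "8g")  // heap per executor
  val sc = new SparkContext(conf)

The driver heap has to be set before the driver JVM starts, e.g. with spark-submit --driver-memory 8g, so it cannot be raised from inside the running application.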
Hi Akhil,
thanks for your help
but I was originally running without the -Xmx option. With it I was just
trying to push the limit of my heap size, but I was obviously doing it wrong.
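Worth noting, if I understand the JVM behavior correctly: on a typical HotSpot JVM, “Requested array size exceeds VM limit” means a single array allocation exceeded the VM’s per-array cap (just below Integer.MAX_VALUE elements), so raising -Xmx alone does not make it go away. A one-line illustration:

  // Throws java.lang.OutOfMemoryError: Requested array size exceeds VM limit
  // on HotSpot regardless of heap size: the per-array cap is slightly
  // below Int.MaxValue elements.
  val tooBig = new Array[Byte](Int.MaxValue)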
Arian Pasquali
http://about.me/arianpasquali
2014-10-20 12:24 GMT+01:00 Akhil Das ak...@sigmoidanalytics.com:
Hi
That's true Guillaume.
I'm currently aggregating documents using a week as the time range.
I will have to make it daily and aggregate the results later.
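A rough sketch of what I mean, with made-up names (Doc, timestamp, and a simple count as the aggregate) just to illustrate keying by day before combining:

  import org.apache.spark.{SparkConf, SparkContext}

  case class Doc(timestamp: Long, text: String)

  val sc = new SparkContext(new SparkConf().setAppName("DailyAggSketch"))
  val docs = sc.parallelize(Seq(Doc(1413763200000L, "a"), Doc(1413849600000L, "b")))

  // Key each document by its calendar day and aggregate per day;
  // the (at most 7) daily results can be rolled up into a week later.
  val dayFormat = new java.text.SimpleDateFormat("yyyy-MM-dd")
  val daily = docs
    .map(d => (dayFormat.format(new java.util.Date(d.timestamp)), 1L))
    .reduceByKey(_ + _)  // replace the count with the real aggregation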
thanks for your hints anyway
Arian Pasquali
http://about.me/arianpasquali
2014-10-20 13:53 GMT+01:00 Guillaume Pitel guillaume.pi
Interesting thread, Marius.
Btw, I'm curious about your cluster size.
How small is it in terms of RAM and cores?
Arian
2014-10-22 13:17 GMT+01:00 Nicholas Chammas nicholas.cham...@gmail.com:
Total guess without knowing anything about your code: Do either of these
two notes from the 1.1.0