I am looking to test hadoop 0.23 or the CDH4 beta on my local VM, to
execute the sample example code on the new architecture, and to play
around with the containers/ResourceManager.
Are there any prerequisites on default memory/CPU/core settings I need to
keep in mind before setting up the VM?
Regards,
Praveenesh,
Speaking minimally (and thereby requiring fewer tweaks on your end),
1.5 GB would be a good value to use for RAM if available (1.0 will do
too, if you make sure to tweak your configs to not use too much heap
memory). A single processor should do fine for testing purposes.
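For reference, the kind of heap tweak described here would typically go in hadoop-env.sh and yarn-env.sh. The 256 MB values below are assumptions chosen for a 1.0-1.5 GB VM, not figures from this thread:

```shell
# hadoop-env.sh -- cap the heap of the HDFS daemons (NameNode/DataNode).
# 256 MB is an assumed value for a small test VM, not a recommendation
# from this thread.
export HADOOP_HEAPSIZE=256

# yarn-env.sh -- likewise cap the heap of the YARN daemons
# (ResourceManager/NodeManager).
export YARN_HEAPSIZE=256
```

With all daemons held to small heaps like these, the remaining RAM is left for the containers themselves.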
On Tue, Apr 17,
Hi,
Sweet! Can you please elaborate on how I can tweak my configs to make
CDH4/hadoop-0.23 run in a 1.5 GB RAM VM?
Regards,
Praveenesh
On Wed, Apr 18, 2012 at 8:42 AM, Harsh J wrote:
> Praveenesh,
>
> Speaking minimally (and thereby requiring fewer tweaks on your end),
> 1.5 GB would be a good value
You are better off trying hadoop-0.23.1 or even hadoop-0.23.2-rc0, since CDH4's
version of YARN is very incomplete and you might get nasty surprises there.
Settings:
# Run 1 NodeManager with yarn.nodemanager.resource.memory-mb -> 1024
# Use the CapacityScheduler (significantly better tested) by setting
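The two settings above might look like this in yarn-site.xml. The scheduler property name and class are taken from the 0.23-era YARN defaults, not from this thread, so treat them as assumptions to check against your exact build:

```xml
<configuration>
  <!-- 1. Cap the memory a single NodeManager offers to containers
       at 1024 MB, per the suggestion above -->
  <property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>1024</value>
  </property>
  <!-- 2. Switch the ResourceManager to the CapacityScheduler
       (property and class name assumed from the 0.23 defaults) -->
  <property>
    <name>yarn.resourcemanager.scheduler.class</name>
    <value>org.apache.hadoop.yarn.server.resourcemanager.scheduler.capacity.CapacityScheduler</value>
  </property>
</configuration>
```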
Thanks Arun,
Will try out those settings. Is there any good documentation on
configuring/playing with hadoop 0.23 apart from the Apache hadoop-0.23
page? I have already looked into that page; just wondering if there is
something more that I don't know about.
Regards,
Praveenesh
On Fri, Apr 20, 2012 at 12