How much RAM do you have?
A good rule of thumb is to budget 1-1.5G per map task and 2G per reduce
task (vmem), and to leave at least 2G of memory free for the OS.
Thus, with 24G and dual quad cores you should be at 8-10 map slots / 2
reduce slots. Scale up if you have more memory.
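For reference, on an MRv1-era cluster those slot counts live in mapred-site.xml. A minimal sketch of the rule of thumb above, assuming a 24G dual quad-core box (the values are illustrative, not a confirmed config):

```xml
<!-- mapred-site.xml: illustrative slot counts for a 24G, 8-core box -->
<property>
  <name>mapred.tasktracker.map.tasks.maximum</name>
  <value>8</value>  <!-- 8 map slots at roughly 1-1.5G each -->
</property>
<property>
  <name>mapred.tasktracker.reduce.tasks.maximum</name>
  <value>2</value>  <!-- 2 reduce slots at roughly 2G each -->
</property>
```

That budgets at most 8 x 1.5G + 2 x 2G = 16G for tasks, leaving headroom for the daemons and the OS.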
Also, ensure you turn on memory monitoring so that a rogue
Hi All,
We have been using dual socket quad core machines for a while and have
been running with 8 mappers and 2 reducers. The rule of thumb we heard was
to slightly oversubscribe the number of cores, but at Hadoop World several
people said different things.
Some new machines we are moving to have 2 socket 6
I cranked those settings up in an attempt to solve the heap issues. Just to
verify, I restored the defaults and cycled both the dfs and mapred daemons.
Still getting the same error.
On 11/13/11 6:34 PM, "Eric Fiala" wrote:
Hoot, these are big numbers - some thoughts
1) does your machine have 1000GB to spare for each java child thread (each
mapper + each reducer)? mapred.child.java.opts / -Xmx1048576m
2) does each of your daemons need / have 10G? HADOOP_HEAPSIZE=1
hth
EF
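The catch Eric is pointing at is that both -Xmx and HADOOP_HEAPSIZE are sized in megabytes, so -Xmx1048576m asks for roughly 1024G per child JVM. If about 1G per task was the intent, the likely fix (a guess at the intended value, not a confirmed one) would be:

```xml
<!-- mapred-site.xml: -Xmx1048576m means ~1024G because the m suffix is MB;
     -Xmx1024m gives each child JVM a 1G heap instead -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx1024m</value>
</property>
<!-- likewise, HADOOP_HEAPSIZE in hadoop-env.sh is specified in MB:
     HADOOP_HEAPSIZE=1000 is ~1G of heap per daemon -->
```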
> # The maximum amount of heap to use, in MB. Default is 1000.