OutOfMemory Error, despite large amounts provided

2008-12-28 Thread Saptarshi Guha
Hello, I have work machines with 32GB and allocated 16GB to the heap size.

==hadoop-env.sh==
export HADOOP_HEAPSIZE=16384

==hadoop-site.xml==
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx16384m</value>
</property>

The same code runs when not being run through Hadoop, but it fails when in a
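[The same child-JVM heap can also be requested per job rather than cluster-wide. A minimal sketch using the 0.19-era org.apache.hadoop.mapred.JobConf API; the class name HeapConfigJob is hypothetical, the property names come from the mail above:]

    import org.apache.hadoop.mapred.JobConf;

    public class HeapConfigJob {
        public static void main(String[] args) {
            JobConf conf = new JobConf(HeapConfigJob.class);
            // Passed verbatim to every spawned map/reduce child JVM.
            conf.set("mapred.child.java.opts", "-Xmx16384m");
            // Note: HADOOP_HEAPSIZE in hadoop-env.sh sizes the Hadoop
            // daemons themselves, not the task child JVMs.
            System.out.println(conf.get("mapred.child.java.opts"));
        }
    }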

Re: OutOfMemory Error, despite large amounts provided

2008-12-28 Thread Brian Bockelman
Hey Saptarshi,

Watch the running child process using ps, top, or Ganglia monitoring. Does the map task actually use 16GB of memory, or is the memory not getting set properly?

Brian

On Dec 28, 2008, at 3:00 PM, Saptarshi Guha wrote:
> Hello, I have work machines with 32GB and
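[One way to answer Brian's question from inside the task itself: a sketch of an old-API mapper that reports the heap ceiling the child JVM actually received. All class and key names here are illustrative, not from the thread:]

    import java.io.IOException;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapred.MapReduceBase;
    import org.apache.hadoop.mapred.Mapper;
    import org.apache.hadoop.mapred.OutputCollector;
    import org.apache.hadoop.mapred.Reporter;

    // Reports Runtime.maxMemory() from inside the map task, so the task
    // status shows whether -Xmx actually reached the child JVM.
    public class HeapCheckMapper extends MapReduceBase
            implements Mapper<LongWritable, Text, Text, Text> {
        public void map(LongWritable key, Text value,
                        OutputCollector<Text, Text> out, Reporter reporter)
                throws IOException {
            long maxMb = Runtime.getRuntime().maxMemory() / (1024L * 1024L);
            reporter.setStatus("max heap = " + maxMb + " MB");
            out.collect(new Text("maxHeapMB"), new Text(Long.toString(maxMb)));
        }
    }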

Re: OutOfMemory Error, despite large amounts provided

2008-12-28 Thread Saptarshi Guha
On Sun, Dec 28, 2008 at 4:33 PM, Brian Bockelman bbock...@cse.unl.edu wrote:
> Hey Saptarshi,
> Watch the running child process using ps, top, or Ganglia monitoring.
> Does the map task actually use 16GB of memory, or is the memory not getting set properly?
> Brian

I haven't figured out how

Re: OutOfMemory Error, despite large amounts provided

2008-12-28 Thread Saptarshi Guha
Caught it in action. Running

ps -e -o 'vsz pid ruser args' | sort -nr | head -5

on a machine where the map task was running:

04812 16962 sguha /home/godhuli/custom/jdk1.6.0_11/jre/bin/java
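[Complementary to ps: the JVM can print the flags it was launched with, via the standard java.lang.management API. A standalone sketch, not from the thread; running it as a task confirms whether -Xmx16384m made it onto the child's command line:]

    import java.lang.management.ManagementFactory;

    public class PrintJvmArgs {
        public static void main(String[] args) {
            // Lists every flag (e.g. -Xmx...) on this JVM's command line.
            for (String arg :
                     ManagementFactory.getRuntimeMXBean().getInputArguments()) {
                System.out.println(arg);
            }
        }
    }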

Re: OutOfMemory Error, despite large amounts provided

2008-12-28 Thread Amareshwari Sriramadasu
Saptarshi Guha wrote:
> Caught it in action. Running ps -e -o 'vsz pid ruser args' | sort -nr | head -5
> on a machine where the map task was running:
> 04812 16962 sguha /home/godhuli/custom/jdk1.6.0_11/jre/bin/java