Hi,
We are using Hadoop 0.19 on a small cluster of 7 machines (7 datanodes, 4
task trackers), and we typically have 3-4 jobs running at a time. We have
been facing the following error on the JobTracker:
java.io.IOException: java.lang.OutOfMemoryError: GC overhead limit exceeded
It seems to be
Meghana,
What is the heapsize for your JT? Try increasing that.
Also, we've fixed a huge number of issues in the JT (and Hadoop overall) since
0.19. Can you upgrade to 0.20.203, the latest stable release?
thanks,
Arun
Sent from my iPhone
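For anyone following along, the JobTracker heap is usually raised in conf/hadoop-env.sh. A minimal sketch, assuming the stock 0.19/0.20 hadoop-env.sh layout (variable names and defaults may differ slightly in your copy, so check your own file; the 2048m value here is just an illustration):

```shell
# conf/hadoop-env.sh -- sketch of raising the JobTracker heap.

# Default heap for all Hadoop daemons, in MB, used when no
# per-daemon override is given.
export HADOOP_HEAPSIZE=1000

# Per-daemon override: give only the JobTracker a larger heap,
# e.g. 2 GB instead of the current 512 MB. Prepending keeps any
# options already set for the daemon.
export HADOOP_JOBTRACKER_OPTS="-Xmx2048m $HADOOP_JOBTRACKER_OPTS"
```

The new setting only takes effect after the MapReduce daemons are restarted (e.g. bin/stop-mapred.sh followed by bin/start-mapred.sh).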
On Jul 4, 2011, at 11:10 PM, Meghana wrote:
Hey Arun,
The JT heapsize (Xmx) is 512m. Will try increasing it, thanks!
Yes, migrating to 0.20 is definitely on my to-do list, but some urgent
issues have taken priority for now :(
Thanks,
..meghana
On 5 July 2011 12:25, Arun C Murthy a...@hortonworks.com wrote: