Re: Program crashed when volume of data getting large

2009-09-23 Thread Chandraprakash Bhagtani
Your tasks are running out of memory. You can increase the task memory by setting the property mapred.child.java.opts to -Xmx500m, which means your tasks (map/reduce) can use a maximum of 500 MB of memory; the default is 200 MB. Increase it as far as your physical memory allows, since falling back on swap space will make processing slow.
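
For example, a minimal driver sketch along these lines (assuming the 0.20-era org.apache.hadoop.mapreduce API; the class and job names are placeholders) would raise the per-task heap before submitting the job. The same property can also be set cluster-wide in mapred-site.xml.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.mapreduce.Job;

    public class LargerHeapJob {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Raise the child JVM heap for map and reduce tasks from the
            // 200 MB default to 500 MB.
            conf.set("mapred.child.java.opts", "-Xmx500m");
            Job job = new Job(conf, "larger-heap-job"); // placeholder job name
            // ... set Mapper/Reducer classes, input/output paths as usual ...
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }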

RE: Program crashed when volume of data getting large

2009-09-23 Thread Amogh Vasekar
Hi, please check the namenode heap usage; your cluster may have too many files to handle or too little free space. Heap usage is generally shown in the namenode web UI. This is one of the causes I have seen for the timeout. Amogh -Original Message- From: Kunsheng Chen [mailto:ke...@yahoo.com]
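
If it does turn out to be a file-count problem, a rough sketch against the standard FileSystem API (the path "/" and class name here are placeholders) can show how many files and directories the namenode is tracking, since each filesystem object is held in namenode heap.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.ContentSummary;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class NamenodeObjectCount {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            FileSystem fs = FileSystem.get(conf);
            // Count files and directories under "/"; large counts mean the
            // namenode heap is holding many objects.
            ContentSummary summary = fs.getContentSummary(new Path("/"));
            System.out.println("Files:        " + summary.getFileCount());
            System.out.println("Directories:  " + summary.getDirectoryCount());
            System.out.println("Bytes stored: " + summary.getLength());
            fs.close();
        }
    }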