Re: Hadoop dfs can't allocate memory with enough hard disk space when data gets huge

2009-10-19 Thread Amogh Vasekar
Hi, It would be more helpful if you provided the exact error here. Also, Hadoop uses the local FS to store intermediate data, along with HDFS for the final output. If your job is memory intensive, try limiting the number of tasks you run in parallel on a machine. Amogh On 10/19/09 8:27 AM,
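As a rough illustration of the tuning Amogh suggests, here is a minimal sketch assuming the Hadoop 0.20-era JobConf API. The slot counts and heap size are illustrative assumptions, not recommendations, and the tasktracker maxima are cluster-side settings (read from mapred-site.xml at daemon startup), not per-job ones:

import org.apache.hadoop.mapred.JobConf;

public class TuneJob {
    public static void main(String[] args) {
        JobConf conf = new JobConf(TuneJob.class);

        // Per-job knobs: cap reduce parallelism and shrink each task's heap.
        conf.setNumReduceTasks(2);
        conf.set("mapred.child.java.opts", "-Xmx512m");

        // Cluster-side knobs (set in each node's mapred-site.xml, read by
        // the TaskTracker at startup; they cannot be changed per job):
        //   mapred.tasktracker.map.tasks.maximum    = 2
        //   mapred.tasktracker.reduce.tasks.maximum = 1
    }
}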

Re: Hadoop dfs can't allocate memory with enough hard disk space when data gets huge

2009-10-19 Thread Ashutosh Chauhan
You might be hitting the small-files problem. This has been discussed multiple times on the list; grepping through the archives will help. Also see http://www.cloudera.com/blog/2009/02/02/the-small-files-problem/ Ashutosh On Sun, Oct 18, 2009 at 22:57, Kunsheng Chen ke...@yahoo.com wrote: I am
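The usual remedy discussed in that post is to pack many small files into one larger container, so the namenode tracks fewer objects. A hedged sketch of that idea using the stock SequenceFile API; the input and output paths are hypothetical:

import java.io.InputStream;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class PackSmallFiles {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        Path inputDir = new Path("/user/demo/small-files"); // hypothetical
        Path packed = new Path("/user/demo/packed.seq");    // hypothetical

        // One key/value pair per small file: key = original path, value = raw bytes.
        SequenceFile.Writer writer = SequenceFile.createWriter(
                fs, conf, packed, Text.class, BytesWritable.class);
        try {
            for (FileStatus status : fs.listStatus(inputDir)) {
                if (status.isDir()) {
                    continue; // only pack plain files
                }
                byte[] buf = new byte[(int) status.getLen()];
                InputStream in = fs.open(status.getPath());
                try {
                    IOUtils.readFully(in, buf, 0, buf.length);
                } finally {
                    in.close();
                }
                writer.append(new Text(status.getPath().toString()),
                              new BytesWritable(buf));
            }
        } finally {
            writer.close();
        }
    }
}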

Re: Hadoop dfs can't allocate memory with enough hard disk space when data gets huge

2009-10-19 Thread Dmitriy Ryaboy
Subject: Re: Hadoop dfs can't allocate memory with enough hard disk space when data gets huge To: common-user@hadoop.apache.org Date: Monday, October 19, 2009, 3:30 PM You might be hitting the small-files problem. This has been discussed multiple times on the list; grepping through the archives

Hadoop dfs can't allocate memory with enough hard disk space when data gets huge

2009-10-18 Thread Kunsheng Chen
I am running a Hadoop program that performs MapReduce work on the files inside a folder. My program is basically doing Map and Reduce work; each line of every file is a pair of strings, and the result is each string associated with its occurrence count across all files. The program works fine until the number of
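For context, the job described has the shape of a classic counting program. A minimal sketch of that shape, assuming the 0.20 mapreduce API; the class names and argument handling are invented for illustration:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class PairCount {

    // Emits (line, 1): each input line is treated as one string-pair record.
    public static class PairMapper
            extends Mapper<LongWritable, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);

        @Override
        protected void map(LongWritable offset, Text line, Context ctx)
                throws IOException, InterruptedException {
            ctx.write(line, ONE);
        }
    }

    // Sums the counts for each distinct pair across all input files.
    public static class SumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text pair, Iterable<IntWritable> counts, Context ctx)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable c : counts) {
                sum += c.get();
            }
            ctx.write(pair, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = new Job(new Configuration(), "pair count");
        job.setJarByClass(PairCount.class);
        job.setMapperClass(PairMapper.class);
        job.setCombinerClass(SumReducer.class);
        job.setReducerClass(SumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}

Run with the input folder and an output path as arguments; each distinct line is counted once per occurrence across all input files.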