Hi,
It would be more helpful if you provided the exact error here.
Also, Hadoop uses the local FS to store intermediate data, with HDFS holding the
final output.
If your job is memory intensive, try limiting the number of tasks you run
in parallel on each machine.
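For example, on a 0.20-era cluster you can cap how many tasks each TaskTracker
runs at once in mapred-site.xml (the values below are only illustrative; tune
them to your hardware, and restart the TaskTrackers after changing them):

  <property>
    <name>mapred.tasktracker.map.tasks.maximum</name>
    <value>2</value>   <!-- concurrent map tasks per node -->
  </property>
  <property>
    <name>mapred.tasktracker.reduce.tasks.maximum</name>
    <value>1</value>   <!-- concurrent reduce tasks per node -->
  </property>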
Amogh
On 10/19/09 8:27 AM,
You might be hitting into the problem of small-files. This has been
discussed multiple times on the list. Greping through archives will help.
Also http://www.cloudera.com/blog/2009/02/02/the-small-files-problem/
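One common workaround is to pack the small files into a single SequenceFile
keyed by file name before running the job. A rough sketch against the 0.20 API
(the class name and the directory/output paths are made up for illustration):

import java.io.ByteArrayOutputStream;
import java.io.InputStream;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.io.SequenceFile;
import org.apache.hadoop.io.Text;

public class PackSmallFiles {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    Path in = new Path(args[0]);   // directory full of small files
    Path out = new Path(args[1]);  // single packed SequenceFile
    SequenceFile.Writer writer = SequenceFile.createWriter(
        fs, conf, out, Text.class, BytesWritable.class);
    try {
      for (FileStatus stat : fs.listStatus(in)) {
        if (stat.isDir()) continue;
        // Read the whole small file into memory.
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        InputStream is = fs.open(stat.getPath());
        try {
          IOUtils.copyBytes(is, buf, conf, false);
        } finally {
          is.close();
        }
        // key = original file name, value = raw file contents
        writer.append(new Text(stat.getPath().getName()),
                      new BytesWritable(buf.toByteArray()));
      }
    } finally {
      writer.close();
    }
  }
}

The job then reads one large file instead of thousands of tiny ones, so the
NameNode tracks far fewer blocks and the mappers get decently sized splits.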
Ashutosh
On Sun, Oct 18, 2009 at 22:57, Kunsheng Chen ke...@yahoo.com wrote:
Subject: Hadoop dfs can't allocate memory with enough hard disk space when data gets huge
To: common-user@hadoop.apache.org
Date: Monday, October 19, 2009, 3:30 PM
I am running a Hadoop program to perform MapReduce work on the files inside a
folder.
My program basically does Map and Reduce work: each line of every file is a
pair of strings, and the result is each string associated with its number of
occurrences across all the files.
The program works fine until the number of files grows large.
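In case it helps, here is a simplified sketch of what the job does (using the
old mapred API; the class name and the whitespace splitting are approximations,
not my exact code):

import java.io.IOException;
import java.util.Iterator;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.MapReduceBase;
import org.apache.hadoop.mapred.Mapper;
import org.apache.hadoop.mapred.OutputCollector;
import org.apache.hadoop.mapred.Reducer;
import org.apache.hadoop.mapred.Reporter;

public class PairCount {
  // Map: each input line holds a pair of strings; emit each string with count 1.
  public static class Map extends MapReduceBase
      implements Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    public void map(LongWritable key, Text line,
                    OutputCollector<Text, IntWritable> out, Reporter r)
        throws IOException {
      // The two strings on the line, separated by whitespace.
      for (String s : line.toString().split("\\s+")) {
        out.collect(new Text(s), ONE);
      }
    }
  }

  // Reduce: sum the occurrences of each string across all files.
  public static class Reduce extends MapReduceBase
      implements Reducer<Text, IntWritable, Text, IntWritable> {
    public void reduce(Text key, Iterator<IntWritable> vals,
                       OutputCollector<Text, IntWritable> out, Reporter r)
        throws IOException {
      int sum = 0;
      while (vals.hasNext()) sum += vals.next().get();
      out.collect(key, new IntWritable(sum));
    }
  }
}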