Re: Mapreduce heap size error

2011-11-15 Thread Hoot Thompson

Re: Mapreduce heap size error

2011-11-15 Thread Hoot Thompson

Re: Mapreduce heap size error

2011-11-14 Thread Hoot Thompson
Still issues, around 2300 unique files:

hadoop@lobster-nfs:~/querry$ hadoop jar HadoopTest.jar -D mapred.child.java.opts=-Xmx4096M hdfs://lobster-nfs:9000/hadoop_fs/dfs/merra/seq_out /hadoop_fs/dfs/output/test_14_r2.out
11/11/15 01:56:20 INFO hpc.Driver: Jar Name: /home/hadoop/querry/Hadoo
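Worth checking: a -D mapred.child.java.opts=... placed after the jar name only reaches the job configuration if the driver hands its arguments to GenericOptionsParser, which is what ToolRunner does. A minimal sketch of such a driver (HeapTestDriver and the job name are hypothetical, not the thread's actual hpc.Driver):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Hypothetical driver: ToolRunner runs the command line through
// GenericOptionsParser, so "-D mapred.child.java.opts=..." lands in the
// Configuration before run() sees the remaining path arguments.
public class HeapTestDriver extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // getConf() already carries any -D overrides from the command line.
        Job job = new Job(getConf(), "heap-test");
        job.setJarByClass(HeapTestDriver.class);
        job.setInputFormatClass(SequenceFileInputFormat.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(), new HeapTestDriver(), args));
    }
}

If the driver parses args[] itself instead, the -D pair is treated as two ordinary arguments and the 4 GB heap never reaches the task JVMs.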

Re: Mapreduce heap size error

2011-11-14 Thread Hoot Thompson
Any suggestions as to how to track down the root cause of these errors?

1178709 [main] INFO org.apache.hadoop.mapred.JobClient - map 6% reduce 0%
1178709 [main] INFO org.apache.hadoop.mapred.JobClient - map 6% reduce 0%
11/11/15 00:45:29 INFO mapred.JobClient: Task Id : attempt_20150008_00
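One way to narrow this down is to log the heap ceiling the child JVM actually received; if the configured -Xmx never reached the task, that shows up immediately in the attempt's stderr log. A sketch, assuming a text-input job (HeapCheckMapper and its key/value types are hypothetical):

import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

// Hypothetical mapper: prints the heap the task JVM actually got, so a
// mapred.child.java.opts value that never took effect is visible in the
// task attempt's stderr.
public class HeapCheckMapper extends Mapper<LongWritable, Text, Text, LongWritable> {

    @Override
    protected void setup(Context context) throws IOException, InterruptedException {
        long maxHeapMb = Runtime.getRuntime().maxMemory() / (1024L * 1024L);
        System.err.println("task max heap = " + maxHeapMb + " MB; "
                + "mapred.child.java.opts = "
                + context.getConfiguration().get("mapred.child.java.opts"));
    }
}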

Re: Mapreduce heap size error

2011-11-13 Thread Hoot Thompson
I cranked those settings up in an attempt to solve the heap issues. Just to verify, I restored the defaults and cycled both dfs and mapred daemons. Still getting the same error.

On 11/13/11 6:34 PM, "Eric Fiala" wrote:
> Hoot, these are big numbers - some thoughts
> 1) does your machine have 1000GB

Mapreduce heap size error

2011-11-12 Thread Hoot Thompson
>>> Can't seem to get past this heap size error, any ideas where to look? Below
>>> are my heap size settings, at least the ones I attempted to increase.
>>>
>>> Thanks in advance for any thoughts.
>>>
>>> # The maximum amount of heap to use, in MB. Default is 1000.
>>> export HADOOP_
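The "Default is 1000" comment is the hadoop-env.sh stanza that precedes the (presumably) HADOOP_HEAPSIZE export, which only sizes the Hadoop daemon JVMs (NameNode, DataNode, JobTracker, TaskTracker); the map/reduce task JVMs take their heap from mapred.child.java.opts instead. That property is per-job, so it can also be set on the job's Configuration without editing hadoop-env.sh. A minimal sketch (the class name and the -Xmx value are illustrative, not from the thread):

import org.apache.hadoop.conf.Configuration;

// Minimal sketch: mapred.child.java.opts is read per job at task launch,
// so setting it on the job Configuration takes effect without a daemon
// restart; HADOOP_HEAPSIZE does not touch task JVMs at all.
public class ChildHeapOpts {
    public static void main(String[] args) {
        Configuration conf = new Configuration();
        conf.set("mapred.child.java.opts", "-Xmx2048m");
        System.out.println("child opts = " + conf.get("mapred.child.java.opts"));
    }
}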