Yes, I ran into this problem too, even after exporting HADOOP_HEAPSIZE=2000,
then 4000 and 6000. My machine has 6 GB of memory.
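
One thing worth checking: HADOOP_HEAPSIZE only sets the heap for the Hadoop daemons (NameNode, JobTracker, TaskTracker, and so on), not for the map/reduce child JVMs. Each child task takes its heap from mapred.child.java.opts, which the quoted config caps at 250 MB, so raising HADOOP_HEAPSIZE alone will not help the reduce phase. A sketch of the change in hadoop-site.xml (1024 MB is just an example value, not a recommendation; tune it to your workload):

```xml
<!-- hadoop-site.xml: heap for each map/reduce child JVM, not the daemons -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx1024m</value>
</property>
```

Note that with both tasktracker maximums set to 4, up to 8 child JVMs can run at once on a node, so the sum of their -Xmx values should stay below physical memory.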

On Wed, Jun 24, 2009 at 4:59 PM, SunGod <[email protected]> wrote:

> The error occurs in the "crawldb TestDB/crawldb" reduce phase.
>
> The error message is: java.lang.OutOfMemoryError: Java heap space
>
> My command:
>  bin/nutch crawl url -dir TestDB -depth 4 -threads 3
>
>  Each fetchlist contains around 200,000 URLs.
>
> My memory-related settings:
>
> hadoop-env.sh
> export HADOOP_HEAPSIZE=800
>
> hadoop-site.xml
> <property>
>  <name>mapred.tasktracker.map.tasks.maximum</name>
>  <value>4</value>
> </property>
> <property>
>  <name>mapred.tasktracker.reduce.tasks.maximum</name>
>  <value>4</value>
> </property>
> <property>
>  <name>mapred.map.tasks</name>
>  <value>2</value>
> </property>
> <property>
>  <name>mapred.reduce.tasks</name>
>  <value>2</value>
> </property>
> <property>
>  <name>mapred.map.max.attempts</name>
>  <value>4</value>
> </property>
> <property>
>  <name>mapred.reduce.max.attempts</name>
>  <value>4</value>
> </property>
> <property>
>  <name>mapred.child.java.opts</name>
>  <value>-Xmx250m</value>
> </property>
>
