Hi all, I posted a Mahout canopy-generation troubleshooting question
last week; however, I didn't get the problem solved. The message below
is the error I received. I'm trying to run canopy generation on about
900 MB of data, an estimated 120,000 vectors.
I'm currently runnin
Hi all, I have been experiencing memory issues while running the Mahout
canopy algorithm on a large dataset on Hadoop. I noticed that only one
reducer was running while the other nodes were idle. I was wondering if
increasing the number of reduce tasks would ease the memory usage and
speed up proc
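If the job honors the standard Hadoop setting, the reducer count can be raised on the command line. A sketch, with placeholder paths and thresholds; `mapred.reduce.tasks` is the pre-YARN property name (newer clusters use `mapreduce.job.reduces`), and whether Mahout's canopy job actually uses more than one reducer depends on the driver implementation:

```shell
# Hypothetical invocation: request more reduce tasks for a Mahout canopy run.
# Input/output paths and the t1/t2 thresholds below are placeholders.
mahout canopy \
  -Dmapred.reduce.tasks=8 \
  -i /user/me/vectors \
  -o /user/me/canopies \
  -dm org.apache.mahout.common.distance.EuclideanDistanceMeasure \
  -t1 500 -t2 250
```

Note that adding reducers spreads work across nodes but does not by itself shrink the heap needed by any single task; if one reducer receives most of the data, it can still run out of memory.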
> unrelated to those. They
> run in their own JVMs each.
>
> Kai
>
> On 25.11.2013 at 15:52, Chih-Hsien Wu wrote:
>
> I'm learning about Hadoop configuration. What is the connection between
> the datanode/ tasktracker heap sizes and the "mapre.c
I'm learning about Hadoop configuration. What is the connection between the
datanode/tasktracker heap sizes and "mapred.child.java.opts"? Does one
have to exceed the other?
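As Kai's reply notes, the daemon heaps and the per-task heap are unrelated: the DataNode and TaskTracker heaps are set via HADOOP_HEAPSIZE in hadoop-env.sh, while each spawned map/reduce task gets its own JVM sized by mapred.child.java.opts. A minimal sketch of the latter, assuming a pre-YARN mapred-site.xml (the 1024 MB value is illustrative):

```xml
<!-- mapred-site.xml: heap for each spawned map/reduce task JVM.
     This is separate from the datanode/tasktracker daemon heaps,
     which are set via HADOOP_HEAPSIZE in hadoop-env.sh. -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx1024m</value>
</property>
```

Neither value has to exceed the other; what matters is that the daemon heaps plus all concurrent child JVMs fit in the node's physical RAM.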
I uploaded data into the distributed file system. The cluster summary shows
there is enough heap memory. However, whenever I try to run a Mahout 0.8
command, the system throws an out-of-heap-memory exception. I shut down the
Hadoop cluster and allocated more memory to mapred.child.java.opts. I then
restarted the
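When raising mapred.child.java.opts, it helps to check that the new per-task heap still fits on the node. A rough budget check (every number below is an illustrative assumption, not a value from this thread):

```python
# Rough memory budget check (all numbers are illustrative assumptions).
# Each map/reduce slot spawns its own child JVM sized by
# mapred.child.java.opts, on top of the daemon heaps.
node_ram_mb = 8192            # physical RAM on the worker node (assumed)
datanode_heap_mb = 1000       # DataNode daemon heap (assumed)
tasktracker_heap_mb = 1000    # TaskTracker daemon heap (assumed)
map_slots, reduce_slots = 4, 2
child_heap_mb = 1024          # -Xmx in mapred.child.java.opts (assumed)

# Total JVM heap committed if every slot is busy at once.
total = (datanode_heap_mb + tasktracker_heap_mb
         + (map_slots + reduce_slots) * child_heap_mb)
print(total, "MB of", node_ram_mb, "MB")  # 8144 MB of 8192 MB
```

If the total approaches or exceeds physical RAM, tasks will fail with heap errors or the node will swap, regardless of what the cluster summary reports for HDFS heap.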