Start much smaller on the number of latent factors (topics).  Try 5, 10, and
then 20.

On Fri, Oct 22, 2010 at 5:35 PM, Sid <[email protected]> wrote:

> At the moment I employ a simple Hadoop setup with 2GB of heap space per
> machine: a 2-node cluster with 20 map processes and 4 reducers, but I can
> change this. BTW, what defines the upper limit on the heap space? I can't
> go beyond 2GB; Hadoop says it's above the valid limit.
>
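For reference (not stated in the thread): in Hadoop of that era, the per-task JVM heap is set through the `mapred.child.java.opts` property in `mapred-site.xml`, which passes JVM flags such as `-Xmx` to each map and reduce task. A minimal sketch, with an illustrative 2048m value:

```xml
<!-- mapred-site.xml: JVM options applied to every map/reduce child task -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Xmx2048m</value>
</property>
```

Note that this heap must fit within the physical memory available per task slot on each node; with 20 map slots on a 2-node cluster, large per-task heaps can easily oversubscribe RAM.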