Hi Lokendra,
Great point, and it turned out to be the case. We had synced the path (or
thought we had); however, it seems it didn't take.
Thanks again, it looks like things are working as expected at this point.
Cheers,
Chris
On Thu, Feb 3, 2011 at 1:21 PM, Lokendra Singh wrote:
Hi,
If you are mainly facing ClassNotFound problems in the Hadoop environment, I
would suggest you put all the Mahout jars in HADOOP_CLASSPATH in
'$HADOOP_HOME/conf/hadoop-env.sh'. Also, while running the MR job, make sure
that $HADOOP_HOME/conf exists in your classpath.
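For example, something along these lines (the MAHOUT_HOME path and jar names
below are only illustrative; adjust them to your installation):

  # In $HADOOP_HOME/conf/hadoop-env.sh -- append the Mahout jars
  # (jar names/versions here are illustrative)
  export MAHOUT_HOME=/usr/local/mahout-0.4
  export HADOOP_CLASSPATH=$MAHOUT_HOME/mahout-core-0.4.jar:$HADOOP_CLASSPATH
  export HADOOP_CLASSPATH=$MAHOUT_HOME/mahout-math-0.4.jar:$HADOOP_CLASSPATH
  export HADOOP_CLASSPATH=$MAHOUT_HOME/mahout-utils-0.4.jar:$HADOOP_CLASSPATH

  # In the shell you launch the job from -- keep Hadoop's conf dir visible too
  export CLASSPATH=$HADOOP_HOME/conf:$CLASSPATH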
Regards
Lokendra
Hi Tim, Jeff,
First, sorry for starting a new thread; apparently our proxy will not
let the list replies come through.
In any event, to answer both of you:
Jeff - you are correct, we did not utilize the core-job jar; however,
we did add all the JAR dependencies (util, math, commons,
collections).
Hi Chris,
If I'm reading your message correctly, it sounds like you are trying to pass
sequence files as input to the clustering job. The clustering jobs require
vectors as input, not just sequence files. So make sure you are pointing to
the output of seq2sparse, which would be something like: pat
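As a rough sketch (illustrative HDFS paths; exact flags can differ between
Mahout versions), the flow would look like:

  # 1) convert the raw text files into SequenceFiles
  bin/mahout seqdirectory -i /path/to/raw-text -o /path/to/seqfiles
  # 2) turn the SequenceFiles into sparse vectors
  bin/mahout seq2sparse -i /path/to/seqfiles -o /path/to/vectors
  # 3) point the clustering job at the vectors, not at the raw SequenceFiles
  bin/mahout kmeans -i /path/to/vectors/tfidf-vectors \
    -c /path/to/initial-clusters -o /path/to/kmeans-output -k 20 -x 10 -ow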
Sounds like you might not be using the mahout-core-0.4-job.jar file? Also, we
don't run on Hadoop 0.20.1, only 0.20.2. Finally, trunk always has the latest and
greatest patches in it and the clustering stuff is quite stable there.
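For reference, a minimal way to launch a clustering driver from the
self-contained job jar (paths and arguments below are only illustrative)
looks something like:

  # Illustrative: the job jar bundles Mahout's dependencies, so nothing
  # else needs to be added to the Hadoop classpath by hand
  hadoop jar mahout-core-0.4-job.jar \
    org.apache.mahout.clustering.kmeans.KMeansDriver \
    -i /path/to/vectors -c /path/to/initial-clusters -o /path/to/output -k 20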
Jeff
-----Original Message-----
From: McConnell, Christopher (GE G