For Hadoop 0.20.203 (the latest stable), is it sufficient to do this to have
the -libjars option parsed from the command line?

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.GenericOptionsParser;

public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();

    Job job = new Job(conf, "makevector");
    job.setJarByClass(MakeVector.class);
    // etc. other code for mappers/reducers
}

I'm thinking I'm missing a step here, which is why it won't load the Mahout
jar while running the MapReduce job.
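
In case it helps to compare, here is a minimal sketch of the ToolRunner-based
driver pattern that, as I understand it, is the usual way to get the generic
options (including -libjars) applied. The class layout below is just my guess
at how MakeVector might be structured, not my actual code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class MakeVector extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // getConf() returns the Configuration that ToolRunner has already
        // populated from the generic options (-libjars, -D, -files, ...).
        Job job = new Job(getConf(), "makevector");
        job.setJarByClass(MakeVector.class);
        // ... set mapper/reducer classes and input/output paths from args ...
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        // ToolRunner runs GenericOptionsParser internally, so -libjars given
        // on the command line ends up on the job's classpath.
        System.exit(ToolRunner.run(new Configuration(), new MakeVector(), args));
    }
}

If that is the right pattern, the invocation should keep the same shape as the
command quoted below (class name first, then -libjars, then the job's own args).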

On Jan 30, 2012, at 11:10 PM, Daniel Quach wrote:

> I compiled using javac:
> 
> javac -classpath 
> :/usr/local/hadoop/hadoop-core-0.20.203.0.jar:/usr/local/hadoop/lib/commons-cli-1.2.jar:/usr/local/mahout/math/target/mahout-math-0.6-SNAPSHOT.jar
>  -d makevector_classes/ MakeVector.java;
> 
> If I don't include the mahout-math jar, it gives me a compile error because 
> of DenseVector.
> 
> 
> On Jan 30, 2012, at 10:42 PM, Prashant Kommireddi wrote:
> 
>> How are you building the mapreduce jar? Try not to include the Mahout dist
>> while building the MR jar, and include it only via the "-libjars" option.
>> 
>> On Mon, Jan 30, 2012 at 10:33 PM, Daniel Quach <danqu...@cs.ucla.edu> wrote:
>> 
>>> I have been compiling my MapReduce job with the jars on the classpath, and
>>> I believe I also need to pass the jars to hadoop via the -libjars option.
>>> However, even when I do this, I still get an error complaining about
>>> missing classes at runtime. (Compilation works fine).
>>> 
>>> Here is my command:
>>> hadoop jar makevector.jar org.myorg.MakeVector -libjars
>>> /usr/local/mahout/math/target/mahout-math-0.6-SNAPSHOT.jar input/ output/
>>> 
>>> This is the error I receive:
>>> Exception in thread "main" java.lang.NoClassDefFoundError:
>>> org/apache/mahout/math/DenseVector
>>> 
>>> I wonder if I am using the GenericOptionsParser incorrectly? I'm not sure
>>> if there is a deeper problem here.
>>> 
> 
