[ https://issues.apache.org/jira/browse/MAHOUT-356?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=12852455#action_12852455 ]

Sean Owen commented on MAHOUT-356:
----------------------------------

You're using the latest code, which is somewhere past 0.3 and will eventually 
become 0.4. That's completely fine.

Sounds like you are doing it right (down to adjusting your command-line args 
for the changes made hours ago). Did it work before? If so, I really suspect 
the "setJarByClass()" change that was made for that issue.

I'll need to look into why that class isn't showing up, otherwise.
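
For reference, here is a minimal sketch of the kind of setup in question (the class and method below are placeholders, not the actual RecommenderJob code). setJarByClass() only tells Hadoop which single jar to ship as the job jar; a class from a separate jar, like mahout-math's IntDoubleProcedure, still has to reach the task classpath some other way, e.g. bundled in the job jar's lib/ directory or shipped with Hadoop's -libjars option.

    // Hypothetical driver fragment using the old mapred API -- illustration only.
    import org.apache.hadoop.mapred.JobConf;

    public class JarSetupSketch {
      public static JobConf configure() {
        JobConf conf = new JobConf();
        // Ships the jar containing this class as the job jar. Classes that
        // live in other jars (e.g. mahout-math) are not pulled in
        // automatically; they must sit in the job jar's lib/ directory or be
        // shipped separately (for instance via -libjars).
        conf.setJarByClass(JarSetupSketch.class);
        return conf;
      }
    }

If IntDoubleProcedure really is only in a separate mahout-math jar, checking whether that jar ends up inside the job jar's lib/ directory would confirm or rule out this theory.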

> ClassNotFoundException: org.apache.mahout.math.function.IntDoubleProcedure
> --------------------------------------------------------------------------
>
>                 Key: MAHOUT-356
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-356
>             Project: Mahout
>          Issue Type: Bug
>          Components: Collaborative Filtering
>    Affects Versions: 0.3
>         Environment: Karmic Ubuntu 9.10, Java 1.6.0_15, Hadoop 0.20
>            Reporter: Kris Jack
>             Fix For: 0.3
>
>
> When running org.apache.mahout.cf.taste.hadoop.item.RecommenderJob on a 
> pseudo-distributed Hadoop cluster, I get a Java ClassNotFoundException.
> Full output:
> 10/04/01 16:50:42 INFO mapred.FileInputFormat: Total input paths to process : 1
> 10/04/01 16:50:43 INFO mapred.JobClient: Running job: job_201004011631_0005
> 10/04/01 16:50:44 INFO mapred.JobClient:  map 0% reduce 0%
> 10/04/01 16:50:55 INFO mapred.JobClient:  map 2% reduce 0%
> 10/04/01 16:50:58 INFO mapred.JobClient:  map 14% reduce 0%
> 10/04/01 16:51:01 INFO mapred.JobClient:  map 24% reduce 0%
> 10/04/01 16:51:04 INFO mapred.JobClient:  map 33% reduce 0%
> 10/04/01 16:51:07 INFO mapred.JobClient:  map 41% reduce 0%
> 10/04/01 16:51:10 INFO mapred.JobClient:  map 50% reduce 0%
> 10/04/01 16:51:23 INFO mapred.JobClient:  map 63% reduce 0%
> 10/04/01 16:51:26 INFO mapred.JobClient:  map 72% reduce 16%
> 10/04/01 16:51:29 INFO mapred.JobClient:  map 83% reduce 16%
> 10/04/01 16:51:32 INFO mapred.JobClient:  map 92% reduce 16%
> 10/04/01 16:51:35 INFO mapred.JobClient:  map 98% reduce 16%
> 10/04/01 16:51:38 INFO mapred.JobClient:  map 100% reduce 16%
> 10/04/01 16:51:41 INFO mapred.JobClient:  map 100% reduce 25%
> 10/04/01 16:51:59 INFO mapred.JobClient:  map 100% reduce 100%
> 10/04/01 16:52:01 INFO mapred.JobClient: Job complete: job_201004011631_0005
> 10/04/01 16:52:01 INFO mapred.JobClient: Counters: 18
> 10/04/01 16:52:01 INFO mapred.JobClient:   Job Counters 
> 10/04/01 16:52:01 INFO mapred.JobClient:     Launched reduce tasks=1
> 10/04/01 16:52:01 INFO mapred.JobClient:     Launched map tasks=4
> 10/04/01 16:52:01 INFO mapred.JobClient:     Data-local map tasks=4
> 10/04/01 16:52:01 INFO mapred.JobClient:   FileSystemCounters
> 10/04/01 16:52:01 INFO mapred.JobClient:     FILE_BYTES_READ=603502320
> 10/04/01 16:52:01 INFO mapred.JobClient:     HDFS_BYTES_READ=257007616
> 10/04/01 16:52:01 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=846533316
> 10/04/01 16:52:01 INFO mapred.JobClient:     HDFS_BYTES_WRITTEN=3417233
> 10/04/01 16:52:01 INFO mapred.JobClient:   Map-Reduce Framework
> 10/04/01 16:52:01 INFO mapred.JobClient:     Reduce input groups=168791
> 10/04/01 16:52:01 INFO mapred.JobClient:     Combine output records=0
> 10/04/01 16:52:01 INFO mapred.JobClient:     Map input records=17359346
> 10/04/01 16:52:01 INFO mapred.JobClient:     Reduce shuffle bytes=179672560
> 10/04/01 16:52:01 INFO mapred.JobClient:     Reduce output records=168791
> 10/04/01 16:52:01 INFO mapred.JobClient:     Spilled Records=60466622
> 10/04/01 16:52:01 INFO mapred.JobClient:     Map output bytes=208312152
> 10/04/01 16:52:01 INFO mapred.JobClient:     Map input bytes=256995325
> 10/04/01 16:52:01 INFO mapred.JobClient:     Combine input records=0
> 10/04/01 16:52:01 INFO mapred.JobClient:     Map output records=17359346
> 10/04/01 16:52:01 INFO mapred.JobClient:     Reduce input records=17359346
> 10/04/01 16:52:01 INFO mapred.FileInputFormat: Total input paths to process : 1
> 10/04/01 16:52:01 INFO mapred.JobClient: Running job: job_201004011631_0006
> 10/04/01 16:52:02 INFO mapred.JobClient:  map 0% reduce 0%
> 10/04/01 16:52:17 INFO mapred.JobClient:  map 15% reduce 0%
> 10/04/01 16:52:20 INFO mapred.JobClient:  map 25% reduce 0%
> 10/04/01 16:52:23 INFO mapred.JobClient:  map 34% reduce 0%
> 10/04/01 16:52:26 INFO mapred.JobClient:  map 45% reduce 0%
> 10/04/01 16:52:29 INFO mapred.JobClient:  map 50% reduce 0%
> 10/04/01 16:52:41 INFO mapred.JobClient:  map 62% reduce 0%
> 10/04/01 16:52:44 INFO mapred.JobClient:  map 70% reduce 16%
> 10/04/01 16:52:48 INFO mapred.JobClient:  map 81% reduce 16%
> 10/04/01 16:52:51 INFO mapred.JobClient:  map 91% reduce 16%
> 10/04/01 16:52:53 INFO mapred.JobClient:  map 96% reduce 16%
> 10/04/01 16:52:56 INFO mapred.JobClient:  map 100% reduce 16%
> 10/04/01 16:53:02 INFO mapred.JobClient:  map 100% reduce 25%
> 10/04/01 16:53:05 INFO mapred.JobClient:  map 100% reduce 0%
> 10/04/01 16:53:07 INFO mapred.JobClient: Task Id : attempt_201004011631_0006_r_000000_0, Status : FAILED
> Error: java.lang.ClassNotFoundException: org.apache.mahout.math.function.IntDoubleProcedure
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
>       at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
>       at org.apache.mahout.cf.taste.hadoop.item.ToUserVectorReducer.reduce(ToUserVectorReducer.java:71)
>       at org.apache.mahout.cf.taste.hadoop.item.ToUserVectorReducer.reduce(ToUserVectorReducer.java:58)
>       at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:463)
>       at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:411)
>       at org.apache.hadoop.mapred.Child.main(Child.java:170)
> 10/04/01 16:53:22 INFO mapred.JobClient: Task Id : attempt_201004011631_0006_r_000000_1, Status : FAILED
> Error: java.lang.ClassNotFoundException: org.apache.mahout.math.function.IntDoubleProcedure
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
>       at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
>       at org.apache.mahout.cf.taste.hadoop.item.ToUserVectorReducer.reduce(ToUserVectorReducer.java:71)
>       at org.apache.mahout.cf.taste.hadoop.item.ToUserVectorReducer.reduce(ToUserVectorReducer.java:58)
>       at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:463)
>       at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:411)
>       at org.apache.hadoop.mapred.Child.main(Child.java:170)
> 10/04/01 16:53:33 INFO mapred.JobClient:  map 100% reduce 8%
> 10/04/01 16:53:36 INFO mapred.JobClient:  map 100% reduce 0%
> 10/04/01 16:53:38 INFO mapred.JobClient: Task Id : attempt_201004011631_0006_r_000000_2, Status : FAILED
> Error: java.lang.ClassNotFoundException: org.apache.mahout.math.function.IntDoubleProcedure
>       at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
>       at java.security.AccessController.doPrivileged(Native Method)
>       at java.net.URLClassLoader.findClass(URLClassLoader.java:188)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:307)
>       at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
>       at java.lang.ClassLoader.loadClass(ClassLoader.java:252)
>       at java.lang.ClassLoader.loadClassInternal(ClassLoader.java:320)
>       at org.apache.mahout.cf.taste.hadoop.item.ToUserVectorReducer.reduce(ToUserVectorReducer.java:71)
>       at org.apache.mahout.cf.taste.hadoop.item.ToUserVectorReducer.reduce(ToUserVectorReducer.java:58)
>       at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:463)
>       at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:411)
>       at org.apache.hadoop.mapred.Child.main(Child.java:170)
> 10/04/01 16:53:53 INFO mapred.JobClient: Job complete: job_201004011631_0006
> 10/04/01 16:53:53 INFO mapred.JobClient: Counters: 14
> 10/04/01 16:53:53 INFO mapred.JobClient:   Job Counters 
> 10/04/01 16:53:53 INFO mapred.JobClient:     Launched reduce tasks=4
> 10/04/01 16:53:53 INFO mapred.JobClient:     Launched map tasks=4
> 10/04/01 16:53:53 INFO mapred.JobClient:     Data-local map tasks=4
> 10/04/01 16:53:53 INFO mapred.JobClient:     Failed reduce tasks=1
> 10/04/01 16:53:53 INFO mapred.JobClient:   FileSystemCounters
> 10/04/01 16:53:53 INFO mapred.JobClient:     FILE_BYTES_READ=566454892
> 10/04/01 16:53:53 INFO mapred.JobClient:     HDFS_BYTES_READ=257007616
> 10/04/01 16:53:53 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=948360656
> 10/04/01 16:53:53 INFO mapred.JobClient:   Map-Reduce Framework
> 10/04/01 16:53:53 INFO mapred.JobClient:     Combine output records=0
> 10/04/01 16:53:53 INFO mapred.JobClient:     Map input records=17359346
> 10/04/01 16:53:53 INFO mapred.JobClient:     Spilled Records=43107276
> 10/04/01 16:53:53 INFO mapred.JobClient:     Map output bytes=347186920
> 10/04/01 16:53:53 INFO mapred.JobClient:     Map input bytes=256995325
> 10/04/01 16:53:53 INFO mapred.JobClient:     Combine input records=0
> 10/04/01 16:53:53 INFO mapred.JobClient:     Map output records=17359346
> Exception in thread "main" java.io.IOException: Job failed!
>       at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1293)
>       at org.apache.mahout.cf.taste.hadoop.item.RecommenderJob.run(RecommenderJob.java:92)
>       at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>       at org.apache.mahout.cf.taste.hadoop.item.RecommenderJob.main(RecommenderJob.java:116)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>       at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>       at java.lang.reflect.Method.invoke(Method.java:597)
>       at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
> Is it a configuration problem on my side or a problem with the code?
> Many thanks,
> Kris

-- 
This message is automatically generated by JIRA.
-
You can reply to this email to add a comment to the issue online.
