>>>>>>>>>>>>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>>>>>>>>>>>>> at java.lang.reflect.Method.invoke
--
Grant Ingersoll
http://www.lucidimagination.com/
Search the Lucene ecosystem (Lucene/Solr/Nutch/Mahout/Tika/Droids)
using Solr/Lucene:
http://www.lucidimagination.com/search
--
>>>>>>>>>>> at org.apache.mahout.clustering.syntheticcontrol.dirichlet.NormalScModelDistribution
>>>>>>>>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:200)
>>>>>>>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>>>>>>>> at j
I've flattened the JOB with all classes in the same JAR and that works
successfully.
Steps:
1) svn co http://svn.apache.org/repos/asf/lucene/mahout/trunk mahout-trunk
2) cd mahout-trunk
3) mvn install
4) hadoop jar examples/target/mahout-examples-0.2-SNAPSHOT.job org.apache.mahout.cluster
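For the record, the flattening itself can be sketched roughly like this. This is an illustrative sketch using only java.util.zip; the class name `FlattenJob` and the first-wins merge policy are my own choices, not the exact commands I ran:

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.Enumeration;
import java.util.HashSet;
import java.util.Set;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import java.util.zip.ZipInputStream;
import java.util.zip.ZipOutputStream;

// Illustrative sketch of "flattening" a Hadoop .job file: the entries of
// every nested lib/*.jar are copied up into one flat jar alongside the
// job's own classes. Duplicate entry names are resolved first-wins.
public class FlattenJob {
    public static void flatten(File job, File out) throws IOException {
        Set<String> seen = new HashSet<>();
        try (ZipFile zf = new ZipFile(job);
             ZipOutputStream zos = new ZipOutputStream(new FileOutputStream(out))) {
            Enumeration<? extends ZipEntry> en = zf.entries();
            while (en.hasMoreElements()) {
                ZipEntry e = en.nextElement();
                if (e.isDirectory()) continue;
                if (e.getName().startsWith("lib/") && e.getName().endsWith(".jar")) {
                    // Unpack the nested jar's entries directly into the flat jar.
                    try (ZipInputStream nested = new ZipInputStream(zf.getInputStream(e))) {
                        ZipEntry ne;
                        while ((ne = nested.getNextEntry()) != null) {
                            if (ne.isDirectory() || !seen.add(ne.getName())) continue;
                            zos.putNextEntry(new ZipEntry(ne.getName()));
                            nested.transferTo(zos);
                            zos.closeEntry();
                        }
                    }
                } else if (seen.add(e.getName())) {
                    // Copy the job's own entry through unchanged.
                    zos.putNextEntry(new ZipEntry(e.getName()));
                    try (InputStream in = zf.getInputStream(e)) {
                        in.transferTo(zos);
                    }
                    zos.closeEntry();
                }
            }
        }
    }
}
```

The resulting jar can then be passed straight to `hadoop jar` with no lib/ directory for the task classloader to trip over.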
Have you tried flattening the JOB so all the classes are packed in a
single JAR? Also, can you give the full list of steps you are doing,
because I am able to run this in pseudo-distro without getting this
error. Also, have you checked the Hadoop logs ($HADOOP/logs, I believe)?
I also noti
I've tried re-running specifically adding the gson jar as follows:
$ hadoop jar examples/target/mahout-examples-0.2-SNAPSHOT.job
org.apache.mahout.clustering.syntheticcontrol.kmeans.Job -libjars
examples/target/dependency/gson-1.3.jar
Unfortunately, I get the same errors as before:
09/07/1
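One caveat worth checking here (my understanding, not verified against the Mahout driver code): `-libjars` is a generic option processed by GenericOptionsParser, so it only takes effect if the main class delegates to ToolRunner; otherwise it is passed through as an ordinary application argument and silently ignored, which is usually what the `WARN ... Use GenericOptionsParser` message is hinting at. The expected shape of the invocation is:

```shell
# -libjars only has an effect when the driver class runs its arguments
# through ToolRunner/GenericOptionsParser; generic options go after the
# class name and before any application-specific arguments.
hadoop jar examples/target/mahout-examples-0.2-SNAPSHOT.job \
  org.apache.mahout.clustering.syntheticcontrol.kmeans.Job \
  -libjars examples/target/dependency/gson-1.3.jar
```

If the Job class doesn't go through ToolRunner, copying the jar into the flattened job file (or onto every node's classpath) is the fallback.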
I recently had these problems as well. I noticed that these errors were not
such a big deal in maps, but if they occur with a reduce, the job will
eventually fail. In my case, I noted that these errors came predominantly
from a single node and so I simply stopped using that node. I think that
th
Here's what I get. but I'm not loading any custom code:
bin/hadoop jar ~/projects/lucene/mahout/clean/examples/target/mahout-examples-0.2-SNAPSHOT.job org.apache.mahout.clustering.syntheticcontrol.kmeans.Job
Preparing Input
09/07/16 13:00:35 WARN mapred.JobClient: Use GenericOptionsParser f
My basic understanding of the class loader stuff is:
1. Any jars that need to be available to map/reduce jobs should be
specified through -libjars (e.g. hadoop --config ... -libjars gson.jar
jar ...)
2. Any jars that need to be available to the main class should be
specified through lib/*.jar
Isn't this the same old problem, that our job jar file has a lib
directory with the Mahout code in it, and the way Hadoop loads the jar it
sometimes cannot resolve classes inside it? IIRC, one needs to smash the
job jar file into a single jar in order for Dirichlet (at least, and any
other examples w
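The underlying mechanics, as far as I understand them, can be shown with a small self-contained demo (illustrative only; this is a plain URLClassLoader, not Hadoop's actual task-side loading code): a jar nested under lib/ inside another jar is just an opaque resource, so entries inside it cannot be resolved unless something unpacks them first.

```java
import java.io.ByteArrayOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

// Shows that a plain URLClassLoader treats a jar nested under lib/ inside
// another jar as an opaque resource: entries inside the nested jar are
// not resolvable, which is the shape of the job-jar problem above.
public class NestedJarDemo {
    /** Returns {nested jar visible as a resource, entry inside it visible}. */
    public static boolean[] check() throws Exception {
        // Build a fake inner.jar holding one dummy class-file entry.
        ByteArrayOutputStream inner = new ByteArrayOutputStream();
        try (ZipOutputStream z = new ZipOutputStream(inner)) {
            z.putNextEntry(new ZipEntry("com/example/Foo.class"));
            z.write(new byte[] { (byte) 0xCA, (byte) 0xFE });
            z.closeEntry();
        }
        // Build an outer jar with the inner jar under lib/, mirroring a .job layout.
        File outer = File.createTempFile("outer", ".jar");
        outer.deleteOnExit();
        try (ZipOutputStream z = new ZipOutputStream(new FileOutputStream(outer))) {
            z.putNextEntry(new ZipEntry("lib/inner.jar"));
            z.write(inner.toByteArray());
            z.closeEntry();
        }
        // No parent loader: only the outer jar's URL is searched.
        try (URLClassLoader cl = new URLClassLoader(new URL[] { outer.toURI().toURL() }, null)) {
            boolean nestedVisible = cl.findResource("lib/inner.jar") != null;
            boolean entryVisible = cl.findResource("com/example/Foo.class") != null;
            return new boolean[] { nestedVisible, entryVisible };
        }
    }

    public static void main(String[] args) throws Exception {
        boolean[] r = check();
        System.out.println("nested jar visible: " + r[0]);          // true
        System.out.println("entry inside nested jar visible: " + r[1]); // false
    }
}
```

Hadoop's task runner normally unpacks the job jar (including lib/) into a working directory before running, which is why the problem only bites in some code paths.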
Hmm, I'm not seeing the ClassNotFound problem but am getting fetch
failures. Will look later.
-Grant
On Jul 16, 2009, at 11:32 AM, Paul Ingles wrote:
I've just tried setting a brand new machine (Ubuntu 8.04 Virtual
Machine) with Hadoop 0.20.0 and running the compile jobs against it. I
get the same problems as before... still scratching my head :(
On 16 Jul 2009, at 12:15, Paul Ingles wrote:
Sure,
I'm running (currently) on my MacBook Air, running OSX Leopard.
JDK: java version "1.6.0_13"
Java(TM) SE Runtime Environment (build 1.6.0_13-b03-211)
Java HotSpot(TM) 64-Bit Server VM (build 11.3-b02-83, mixed mode)
Hadoop is: 0.20.0, r763504
I'm compiling mahout from trunk (r794023) as
Can you share how you built and how you are running, as in command
line options, etc.? Also, JDK version, Hadoop version, etc.
On Jul 16, 2009, at 6:21 AM, Paul Ingles wrote:
Hi,
Thank you for the suggestion. Unfortunately, when I tried that I received the
same error. I've also tried copying the gson jar directly into $HADOOP_HOME/lib
(when I was running a single node pseudo-distributed) and get the same error
still.
Weirdly enough, if I try and run the Dirichlet e
try hadoop --config jar -libjars
Adil
Paul Ingles wrote:
Hi,
Apologies for the cross-posting (I also sent this to the Hadoop user
list) but I'm still getting errors if I try and run the KMeans
examples on a cluster, whether that be my single-node Mac Pro, or our
cluster. I've attached the stack trace at the bottom of the email.
The gson jar is