>> org.apache.mahout.math.CardinalityException: Required cardinality 50 but got 5
apparently your rows are of size 50, not 5? -jake

On Thu, Jun 16, 2011 at 3:16 PM, stefan.bobocescu <[email protected]> wrote:

> I'm trying to get the eigenvectors using Mahout without Hadoop. Is this a
> problem?
>
> If not: when I run
>
> stefan@stefan-desktop:~/Desktop/trunk$ ./bin/mahout svd -i /home/stefan/Desktop/mahoutinput -o /home/stefan/Desktop/mahoutoutput -r 2 --numCols 5 --numRows 5
>
> I get:
>
> no HADOOP_HOME set, running locally
> SLF4J: Class path contains multiple SLF4J bindings.
> SLF4J: Found binding in [jar:file:/home/stefan/Desktop/trunk/examples/target/mahout-examples-0.6-SNAPSHOT-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: Found binding in [jar:file:/home/stefan/Desktop/trunk/examples/target/dependency/slf4j-jcl-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
> Jun 17, 2011 12:58:55 AM org.slf4j.impl.JCLLoggerAdapter info
> INFO: Command line arguments: {--endPhase=2147483647, --inMemory=false, --input=/home/stefan/Desktop/mahoutoutput, --maxError=0.05, --minEigenvalue=0.0, --numCols=5, --numRows=5, --output=/home/stefan/Desktop/mahoutinput, --rank=2, --startPhase=0, --tempDir=temp}
> Jun 17, 2011 12:58:55 AM org.slf4j.impl.JCLLoggerAdapter info
> INFO: Finding 2 singular vectors of matrix with 5 rows, via Lanczos
> Jun 17, 2011 12:58:55 AM org.apache.hadoop.metrics.jvm.JvmMetrics init
> INFO: Initializing JVM Metrics with processName=JobTracker, sessionId=
> Jun 17, 2011 12:58:55 AM org.apache.hadoop.mapred.FileInputFormat listStatus
> INFO: Total input paths to process : 2
> Jun 17, 2011 12:58:56 AM org.apache.hadoop.mapred.JobClient monitorAndPrintJob
> INFO: Running job: job_local_0001
> Jun 17, 2011 12:58:56 AM org.apache.hadoop.mapred.FileInputFormat listStatus
> INFO: Total input paths to process : 2
> Jun 17, 2011 12:58:56 AM org.apache.hadoop.mapred.MapTask runOldMapper
> INFO: numReduceTasks: 1
> Jun 17, 2011 12:58:56 AM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
> INFO: io.sort.mb = 100
> Jun 17, 2011 12:58:56 AM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
> INFO: data buffer = 79691776/99614720
> Jun 17, 2011 12:58:56 AM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
> INFO: record buffer = 262144/327680
> Jun 17, 2011 12:58:56 AM org.apache.hadoop.mapred.LocalJobRunner$Job run
> WARNING: job_local_0001
> org.apache.mahout.math.CardinalityException: Required cardinality 50 but got 5
>         at org.apache.mahout.math.DenseVector.dot(DenseVector.java:241)
>         at org.apache.mahout.math.NamedVector.dot(NamedVector.java:142)
>         at org.apache.mahout.math.hadoop.TimesSquaredJob$TimesSquaredMapper.scale(TimesSquaredJob.java:243)
>         at org.apache.mahout.math.hadoop.TimesSquaredJob$TimesSquaredMapper.map(TimesSquaredJob.java:234)
>         at org.apache.mahout.math.hadoop.TimesSquaredJob$TimesSquaredMapper.map(TimesSquaredJob.java:196)
>         at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
>         at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:358)
>         at org.apache.hadoop.mapred.MapTask.run(MapTask.java:307)
>         at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:177)
> Jun 17, 2011 12:58:57 AM org.apache.hadoop.mapred.JobClient monitorAndPrintJob
> INFO: map 0% reduce 0%
> Jun 17, 2011 12:58:57 AM org.apache.hadoop.mapred.JobClient monitorAndPrintJob
> INFO: Job complete: job_local_0001
> Jun 17, 2011 12:58:57 AM org.apache.hadoop.mapred.Counters log
> INFO: Counters: 0
> Exception in thread "main" java.lang.IllegalStateException: java.io.IOException: Job failed!
>         at org.apache.mahout.math.hadoop.DistributedRowMatrix.timesSquared(DistributedRowMatrix.java:240)
>         at org.apache.mahout.math.decomposer.lanczos.LanczosSolver.solve(LanczosSolver.java:104)
>         at org.apache.mahout.math.hadoop.decomposer.DistributedLanczosSolver.run(DistributedLanczosSolver.java:212)
>         at org.apache.mahout.math.hadoop.decomposer.DistributedLanczosSolver.run(DistributedLanczosSolver.java:134)
>         at org.apache.mahout.math.hadoop.decomposer.DistributedLanczosSolver$DistributedLanczosSolverJob.run(DistributedLanczosSolver.java:295)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>         at org.apache.mahout.math.hadoop.decomposer.DistributedLanczosSolver.main(DistributedLanczosSolver.java:301)
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:616)
>         at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
>         at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
>         at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:188)
> Caused by: java.io.IOException: Job failed!
>         at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1252)
>         at org.apache.mahout.math.hadoop.DistributedRowMatrix.....
>
> Can somebody help me to obtain the rawEigenVector? Thank you.
>
> --
> View this message in context: http://lucene.472066.n3.nabble.com/mahouts-svd-tp3074218p3074218.html
> Sent from the Mahout User List mailing list archive at Nabble.com.
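To expand on the one-line answer above: the stack trace shows the failure inside DenseVector.dot, which Mahout calls when TimesSquaredJob multiplies each input row by a vector sized from --numCols. If the rows on disk actually have cardinality 50 while --numCols is 5, the dot product fails exactly as logged. Here is a minimal illustrative sketch of that size check (this is not Mahout's actual source; the class and exception type are stand-ins):

```java
// DotCheck.java -- illustrative sketch of the cardinality check that a
// vector dot product performs; NOT Mahout's real implementation.
public class DotCheck {

    // Dot product that refuses mismatched sizes, mimicking the
    // "Required cardinality X but got Y" failure mode from the log.
    static double dot(double[] a, double[] b) {
        if (a.length != b.length) {
            throw new IllegalArgumentException(
                "Required cardinality " + a.length + " but got " + b.length);
        }
        double sum = 0.0;
        for (int i = 0; i < a.length; i++) {
            sum += a[i] * b[i];
        }
        return sum;
    }

    public static void main(String[] args) {
        double[] row = new double[50]; // a matrix row with 50 entries...
        double[] v   = new double[5];  // ...dotted with a 5-element vector
        try {
            dot(row, v);
        } catch (IllegalArgumentException e) {
            // prints: Required cardinality 50 but got 5
            System.out.println(e.getMessage());
        }
    }
}
```

If Jake's reading is right, the fix would be to make the two sides agree: either regenerate the input matrix so its rows really have 5 entries, or pass --numCols 50 (and a matching --numRows) to `mahout svd` so the multiplication vector matches the rows that are actually on disk.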
