See <https://builds.apache.org/job/Mahout-Examples-Cluster-Reuters/301/changes>

Changes:

[ssc] MAHOUT-1205 ParallelALSFactorizationJob should leverage the distributed cache

[ssc] MAHOUT-1205 ParallelALSFactorizationJob should leverage the distributed cache

[ssc] adding missing changelog entries

------------------------------------------
[...truncated 1457 lines...]
INFO:     Reduce input records=0
May 07, 2013 10:09:50 AM org.apache.hadoop.mapred.Counters log
INFO:     Reduce input groups=0
May 07, 2013 10:09:50 AM org.apache.hadoop.mapred.Counters log
INFO:     Combine output records=0
May 07, 2013 10:09:50 AM org.apache.hadoop.mapred.Counters log
INFO:     Physical memory (bytes) snapshot=0
May 07, 2013 10:09:50 AM org.apache.hadoop.mapred.Counters log
INFO:     Reduce output records=0
May 07, 2013 10:09:50 AM org.apache.hadoop.mapred.Counters log
INFO:     Virtual memory (bytes) snapshot=0
May 07, 2013 10:09:50 AM org.apache.hadoop.mapred.Counters log
INFO:     Map output records=0
May 07, 2013 10:09:51 AM org.apache.hadoop.mapreduce.lib.input.FileInputFormat listStatus
INFO: Total input paths to process : 1
May 07, 2013 10:09:51 AM org.apache.hadoop.filecache.TrackerDistributedCacheManager downloadCacheObject
INFO: Creating frequency.file-0 in /tmp/hadoop-jenkins/mapred/local/archive/5841380952815181520_-299119410_2121790508/file/tmp/mahout-work-jenkins/reuters-out-seqdir-sparse-kmeans-work-2210575749095952834 with rwxr-xr-x
May 07, 2013 10:09:51 AM org.apache.hadoop.filecache.TrackerDistributedCacheManager downloadCacheObject
INFO: Cached /tmp/mahout-work-jenkins/reuters-out-seqdir-sparse-kmeans/frequency.file-0 as /tmp/hadoop-jenkins/mapred/local/archive/5841380952815181520_-299119410_2121790508/file/tmp/mahout-work-jenkins/reuters-out-seqdir-sparse-kmeans/frequency.file-0
May 07, 2013 10:09:51 AM org.apache.hadoop.filecache.TrackerDistributedCacheManager localizePublicCacheObject
INFO: Cached /tmp/mahout-work-jenkins/reuters-out-seqdir-sparse-kmeans/frequency.file-0 as /tmp/hadoop-jenkins/mapred/local/archive/5841380952815181520_-299119410_2121790508/file/tmp/mahout-work-jenkins/reuters-out-seqdir-sparse-kmeans/frequency.file-0
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.JobClient monitorAndPrintJob
INFO: Running job: job_local_0006
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.Task initialize
INFO:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@19210089
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
INFO: io.sort.mb = 100
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
INFO: data buffer = 79691776/99614720
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
INFO: record buffer = 262144/327680
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.MapTask$MapOutputBuffer flush
INFO: Starting flush of map output
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.Task done
INFO: Task:attempt_local_0006_m_000000_0 is done. And is in the process of commiting
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO: 
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.Task sendDone
INFO: Task 'attempt_local_0006_m_000000_0' done.
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.Task initialize
INFO:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@5252bb2d
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO: 
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.Merger$MergeQueue merge
INFO: Merging 1 sorted segments
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.Merger$MergeQueue merge
INFO: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO: 
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.Task done
INFO: Task:attempt_local_0006_r_000000_0 is done. And is in the process of commiting
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO: 
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.Task commit
INFO: Task attempt_local_0006_r_000000_0 is allowed to commit now
May 07, 2013 10:09:51 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter commitTask
INFO: Saved output of task 'attempt_local_0006_r_000000_0' to /tmp/mahout-work-jenkins/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO: reduce > reduce
May 07, 2013 10:09:51 AM org.apache.hadoop.mapred.Task sendDone
INFO: Task 'attempt_local_0006_r_000000_0' done.
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.JobClient monitorAndPrintJob
INFO:  map 100% reduce 100%
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.JobClient monitorAndPrintJob
INFO: Job complete: job_local_0006
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO: Counters: 20
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:   File Output Format Counters 
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     Bytes Written=102
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:   FileSystemCounters
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     FILE_BYTES_READ=765226509
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     FILE_BYTES_WRITTEN=771814596
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:   File Input Format Counters 
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     Bytes Read=102
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:   Map-Reduce Framework
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     Map output materialized bytes=6
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     Map input records=0
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     Reduce shuffle bytes=0
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     Spilled Records=0
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     Map output bytes=0
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     Total committed heap usage (bytes)=1288044544
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     CPU time spent (ms)=0
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     SPLIT_RAW_BYTES=151
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     Combine input records=0
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     Reduce input records=0
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     Reduce input groups=0
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     Combine output records=0
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     Physical memory (bytes) snapshot=0
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     Reduce output records=0
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     Virtual memory (bytes) snapshot=0
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Counters log
INFO:     Map output records=0
May 07, 2013 10:09:52 AM org.apache.hadoop.mapreduce.lib.input.FileInputFormat listStatus
INFO: Total input paths to process : 1
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.JobClient monitorAndPrintJob
INFO: Running job: job_local_0007
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Task initialize
INFO:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@1a3d8b15
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
INFO: io.sort.mb = 100
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
INFO: data buffer = 79691776/99614720
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.MapTask$MapOutputBuffer <init>
INFO: record buffer = 262144/327680
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.MapTask$MapOutputBuffer flush
INFO: Starting flush of map output
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Task done
INFO: Task:attempt_local_0007_m_000000_0 is done. And is in the process of commiting
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO: 
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Task sendDone
INFO: Task 'attempt_local_0007_m_000000_0' done.
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Task initialize
INFO:  Using ResourceCalculatorPlugin : org.apache.hadoop.util.LinuxResourceCalculatorPlugin@21fd3d92
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO: 
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Merger$MergeQueue merge
INFO: Merging 1 sorted segments
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Merger$MergeQueue merge
INFO: Down to the last merge-pass, with 0 segments left of total size: 0 bytes
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO: 
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Task done
INFO: Task:attempt_local_0007_r_000000_0 is done. And is in the process of commiting
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO: 
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Task commit
INFO: Task attempt_local_0007_r_000000_0 is allowed to commit now
May 07, 2013 10:09:52 AM org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter commitTask
INFO: Saved output of task 'attempt_local_0007_r_000000_0' to /tmp/mahout-work-jenkins/reuters-out-seqdir-sparse-kmeans/tfidf-vectors
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.LocalJobRunner$Job statusUpdate
INFO: reduce > reduce
May 07, 2013 10:09:52 AM org.apache.hadoop.mapred.Task sendDone
INFO: Task 'attempt_local_0007_r_000000_0' done.
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.JobClient monitorAndPrintJob
INFO:  map 100% reduce 100%
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.JobClient monitorAndPrintJob
INFO: Job complete: job_local_0007
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO: Counters: 20
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:   File Output Format Counters 
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     Bytes Written=102
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:   FileSystemCounters
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     FILE_BYTES_READ=892764204
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     FILE_BYTES_WRITTEN=900449664
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:   File Input Format Counters 
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     Bytes Read=102
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:   Map-Reduce Framework
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     Map output materialized bytes=6
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     Map input records=0
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     Reduce shuffle bytes=0
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     Spilled Records=0
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     Map output bytes=0
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     Total committed heap usage (bytes)=1487405056
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     CPU time spent (ms)=0
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     SPLIT_RAW_BYTES=158
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     Combine input records=0
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     Reduce input records=0
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     Reduce input groups=0
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     Combine output records=0
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     Physical memory (bytes) snapshot=0
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     Reduce output records=0
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     Virtual memory (bytes) snapshot=0
May 07, 2013 10:09:53 AM org.apache.hadoop.mapred.Counters log
INFO:     Map output records=0
May 07, 2013 10:09:53 AM org.slf4j.impl.JCLLoggerAdapter info
INFO: Deleting /tmp/mahout-work-jenkins/reuters-out-seqdir-sparse-kmeans/partial-vectors-0
May 07, 2013 10:09:53 AM org.slf4j.impl.JCLLoggerAdapter info
INFO: Program took 11532 ms (Minutes: 0.1922)
hadoop binary is not in PATH,HADOOP_HOME/bin,HADOOP_PREFIX/bin, running locally
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:https://builds.apache.org/job/Mahout-Examples-Cluster-Reuters/ws/trunk/examples/target/mahout-examples-0.8-SNAPSHOT-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:https://builds.apache.org/job/Mahout-Examples-Cluster-Reuters/ws/trunk/examples/target/dependency/slf4j-jcl-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.JCLLoggerFactory]
May 07, 2013 10:09:54 AM org.slf4j.impl.JCLLoggerAdapter info
INFO: Command line arguments: {--clustering=null, --clusters=[/tmp/mahout-work-jenkins/reuters-kmeans-clusters], --convergenceDelta=[0.5], --distanceMeasure=[org.apache.mahout.common.distance.CosineDistanceMeasure], --endPhase=[2147483647], --input=[/tmp/mahout-work-jenkins/reuters-out-seqdir-sparse-kmeans/tfidf-vectors/], --maxIter=[10], --method=[mapreduce], --numClusters=[20], --output=[/tmp/mahout-work-jenkins/reuters-kmeans], --overwrite=null, --startPhase=[0], --tempDir=[temp]}
May 07, 2013 10:09:54 AM org.apache.hadoop.util.NativeCodeLoader <clinit>
WARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
May 07, 2013 10:09:54 AM org.apache.hadoop.io.compress.CodecPool getCompressor
INFO: Got brand-new compressor
May 07, 2013 10:09:54 AM org.slf4j.impl.JCLLoggerAdapter info
INFO: Wrote 20 Klusters to /tmp/mahout-work-jenkins/reuters-kmeans-clusters/part-randomSeed
May 07, 2013 10:09:54 AM org.slf4j.impl.JCLLoggerAdapter info
INFO: Input: /tmp/mahout-work-jenkins/reuters-out-seqdir-sparse-kmeans/tfidf-vectors Clusters In: /tmp/mahout-work-jenkins/reuters-kmeans-clusters/part-randomSeed Out: /tmp/mahout-work-jenkins/reuters-kmeans Distance: org.apache.mahout.common.distance.CosineDistanceMeasure
May 07, 2013 10:09:54 AM org.slf4j.impl.JCLLoggerAdapter info
INFO: convergence: 0.5 max Iterations: 10 num Reduce Tasks: org.apache.mahout.math.VectorWritable Input Vectors: {}
May 07, 2013 10:09:54 AM org.apache.hadoop.io.compress.CodecPool getDecompressor
INFO: Got brand-new decompressor
Exception in thread "main" java.lang.IllegalStateException: No input clusters found in /tmp/mahout-work-jenkins/reuters-kmeans-clusters/part-randomSeed. Check your -c argument.
        at org.apache.mahout.clustering.kmeans.KMeansDriver.buildClusters(KMeansDriver.java:217)
        at org.apache.mahout.clustering.kmeans.KMeansDriver.run(KMeansDriver.java:148)
        at org.apache.mahout.clustering.kmeans.KMeansDriver.run(KMeansDriver.java:107)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.mahout.clustering.kmeans.KMeansDriver.main(KMeansDriver.java:48)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:601)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
        at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
        at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:195)
Build step 'Execute shell' marked build as failure
