See <https://builds.apache.org/job/Mahout-Quality/3525/display/redirect?page=changes>
Changes:

[rawkintrevo] rebased off of master fresh
[rawkintrevo] viennacl and viennacl-omp Pall-scala
[rawkintrevo] better cleaning
[rawkintrevo] failing on h2o, but works otherwise
[rawkintrevo] added tests
[rawkintrevo] fix antrun copy issue on h2o
[rawkintrevo] h2o dependency reduced
[rawkintrevo] add scala version to spark profiles or math-scala doesn't build
[rawkintrevo] moved integration module to end
[rawkintrevo] moved distribution module to end
[rawkintrevo] distribution was looking for jar named with classifier
[trevor.d.grant] throwing poo at the wall
[trevor.d.grant] still broken, but profile order matters

------------------------------------------
[...truncated 320.30 KB...]

X2     -4.15265   +1.78491  -2.32654  +0.08056
X3     -5.67991   +1.88687  -3.01022  +0.03954
X4   +163.17933  +51.91530  +3.14318  +0.03474

F-statistic: 16.385423615806292 on 4 and 4 DF, p-value: 0.009545
Mean Squared Error: 6.4571565392236
R^2: 0.9424805502529814

- fittness tests
- durbinWatsonTest test

RegressionSuite:
- ordinary least squares
- cochrane-orcutt

BlasSuite:
AB' num partitions = 3.
{ 0 => {0:26.0,1:38.0} 1 => {0:38.0,1:56.0} 2 => {0:50.0,1:74.0} }
- ABt
- A * B Hadamard
- A + B Elementwise
- A - B Elementwise
- A / B Elementwise
{ 0 => {0:5.0,1:8.0} 1 => {0:8.0,1:13.0} }
{ 0 => {0:5.0,1:8.0} 1 => {0:8.0,1:13.0} }
- AtA slim
{ 0 => {0:1.0,1:2.0,2:3.0} 1 => {0:2.0,1:3.0,2:4.0} 2 => {0:3.0,1:4.0,2:5.0} }
- At
- verbosity

SimilarityAnalysisSuite:
- Cross-occurrence [A'A], [B'A] boolean data using LLR
- Cross-occurrence [A'A], [B'A] double data using LLR
- Cross-occurrence [A'A], [B'A] integer data using LLR
- Cross-occurrence two matrices with different number of columns
- Cross-occurrence two IndexedDatasets
- Cross-occurrence two IndexedDatasets different row ranks
- Cross-occurrence two IndexedDatasets LLR threshold
- LLR calc
- downsampling by number per row

ClassifierStatsSparkTestSuite:
- testFullRunningAverageAndStdDev
- testBigFullRunningAverageAndStdDev
- testStddevFullRunningAverageAndStdDev
- testFullRunningAverage
- testFullRunningAveragCopyConstructor
- testInvertedRunningAverage
- testInvertedRunningAverageAndStdDev
- testBuild
- GetMatrix
- testPrecisionRecallAndF1ScoreAsScikitLearn

RLikeDrmOpsSuite:
- A.t
{ 0 => {0:11.0,1:17.0} 1 => {0:25.0,1:39.0} }
{ 0 => {0:11.0,1:17.0} 1 => {0:25.0,1:39.0} }
- C = A %*% B
{ 0 => {0:11.0,1:17.0} 1 => {0:25.0,1:39.0} }
{ 0 => {0:11.0,1:17.0} 1 => {0:25.0,1:39.0} }
Q=
{ 0 => {0:0.40273861426601687,1:-0.9153150324187648} 1 => {0:0.9153150324227656,1:0.40273861426427493} }
- C = A %*% B mapBlock {}
- C = A %*% B incompatible B keys
- Spark-specific C = At %*% B , join
- C = At %*% B , join, String-keyed
- C = At %*% B , zippable, String-keyed
- C = A %*% B.t
{ 0 => {0:26.0,1:35.0,2:46.0,3:51.0} 1 => {0:50.0,1:69.0,2:92.0,3:105.0} 2 => {0:62.0,1:86.0,2:115.0,3:132.0} 3 => {0:74.0,1:103.0,2:138.0,3:159.0} }
- C = A %*% inCoreB
{ 0 => {0:26.0,1:35.0,2:46.0,3:51.0} 1 => {0:50.0,1:69.0,2:92.0,3:105.0} 2 => {0:62.0,1:86.0,2:115.0,3:132.0} 3 => {0:74.0,1:103.0,2:138.0,3:159.0} }
- C = inCoreA %*%: B
- C = A.t %*% A
- C = A.t %*% A fat non-graph
- C = A.t %*% A non-int key
- C = A + B
A=
{ 0 => {0:1.0,1:2.0,2:3.0} 1 => {0:3.0,1:4.0,2:5.0} 2 => {0:5.0,1:6.0,2:7.0} }
B=
{ 0 => {0:0.9642417991285117,1:0.8912109669210633,2:0.9068284737240229} 1 => {0:0.21510881935800774,1:0.2520809477156928,2:0.11200953068118802} 2 => {0:0.5065984433451026,1:0.9865993528224783,2:0.02122628475914201} }
C=
{ 0 => {0:1.9642417991285117,1:2.8912109669210633,2:3.9068284737240226} 1 => {0:3.215108819358008,1:4.252080947715693,2:5.112009530681188} 2 => {0:5.506598443345102,1:6.986599352822479,2:7.021226284759142} }
- C = A + B, identically partitioned
- C = A + B side test 1
- C = A + B side test 2
- C = A + B side test 3
- Ax
- A'x
- colSums, colMeans
- rowSums, rowMeans
- A.diagv
- numNonZeroElementsPerColumn
- C = A cbind B, cogroup
- C = A cbind B, zip
- B = 1 cbind A
- B = A cbind 1
- B = A + 1.0
- C = A rbind B
- C = A rbind B, with empty
- scalarOps
optimized:OpAewUnaryFunc(org.apache.mahout.sparkbindings.drm.CheckpointedDrmSpark@33015f28,<function1>,false)
- A * A -> sqr(A) rewrite
optimizer rewritten:OpAewUnaryFuncFusion(org.apache.mahout.sparkbindings.drm.CheckpointedDrmSpark@35c1444b,List(OpAewUnaryFunc(OpAewUnaryFunc(OpAewUnaryFunc(org.apache.mahout.sparkbindings.drm.CheckpointedDrmSpark@35c1444b,<function1>,false),<function1>,false),<function1>,true), OpAewUnaryFunc(OpAewUnaryFunc(org.apache.mahout.sparkbindings.drm.CheckpointedDrmSpark@35c1444b,<function1>,false),<function1>,false), OpAewUnaryFunc(org.apache.mahout.sparkbindings.drm.CheckpointedDrmSpark@35c1444b,<function1>,false)))
- B = 1 + 2 * (A * A) ew unary function fusion
- functional apply()
- C = A + B missing rows
- C = cbind(A, B) with missing rows
collected A =
{ 0 => {0:1.0,1:2.0,2:3.0} 1 => {} 2 => {} 3 => {0:3.0,1:4.0,2:5.0} }
collected B =
{ 0 => {0:2.0,1:3.0,2:4.0} 1 => {0:1.0,1:1.0,2:1.0} 2 => {0:1.0,1:1.0,2:1.0} 3 => {0:4.0,1:5.0,2:6.0} }
- B = A + 1.0 missing rows
in-core mul ms: 1256
a'b plan:OpAtB(org.apache.mahout.sparkbindings.drm.CheckpointedDrmSpark@68396a6a,org.apache.mahout.sparkbindings.drm.CheckpointedDrmSpark@11798a9b)
a'b plan contains 3 partitions.
distributed mul ms: 2339.
- A'B, bigger
- C = At %*% B , zippable

TextDelimitedReaderWriterSuite:
- indexedDatasetDFSRead should read sparse matrix file with null rows

PreprocessorSuite:
OpMapBlock(org.apache.mahout.sparkbindings.drm.CheckpointedDrmSpark@3fe34ebd,<function1>,8,-1,true)
{1:3.0,2:5.0,3:6.0}
- asfactor test
{0:2.0,1:5.0,2:-4.0}
{0:0.8164965809277263,1:3.2659863237109037,2:8.286535263104035}
- standard scaler test
- mean center test

TFIDFSparkTestSuite:
- TF test
- TFIDF test
- MLlib TFIDF test

Run completed in 1 minute, 56 seconds.
Total number of tests run: 131
Suites: completed 19, aborted 0
Tests: succeeded 131, failed 0, canceled 0, ignored 1, pending 0
All tests passed.

[INFO]
[INFO] --- build-helper-maven-plugin:1.9.1:remove-project-artifact (remove-old-mahout-artifacts) @ mahout-spark_2.10 ---
[INFO] /home/jenkins/.m2/repository/org/apache/mahout/mahout-spark_2.10 removed.
[INFO]
[INFO] --- maven-site-plugin:3.4:attach-descriptor (attach-descriptor) @ mahout-spark_2.10 ---
[INFO]
[INFO] --- maven-antrun-plugin:1.4:run (copy) @ mahout-spark_2.10 ---
project.artifactId
[INFO] Executing tasks
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Mahout
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Mahout Build Tools
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Mahout Build Tools ................................. SUCCESS [  2.910 s]
[INFO] Apache Mahout ...................................... SUCCESS [  0.796 s]
[INFO] Mahout Math ........................................ SUCCESS [01:37 min]
[INFO] Mahout HDFS ........................................ SUCCESS [  6.212 s]
[INFO] Mahout Map-Reduce .................................. SUCCESS [12:34 min]
[INFO] Mahout Integration ................................. SUCCESS [01:02 min]
[INFO] Mahout Examples .................................... SUCCESS [ 28.389 s]
[INFO] Mahout Math Scala bindings ......................... SUCCESS [04:56 min]
[INFO] Mahout Spark bindings .............................. FAILURE [02:59 min]
[INFO] Mahout H2O backend ................................. SKIPPED
[INFO] Mahout Release Package ............................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 23:52 min
[INFO] Finished at: 2017-12-18T04:10:55+00:00
[INFO] Final Memory: 69M/965M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-antrun-plugin:1.4:run (copy) on project mahout-spark_2.10: An Ant BuildException has occured: Only one of tofile and todir may be set. -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :mahout-spark_2.10
Build step 'Invoke top-level Maven targets' marked build as failure
[PMD] Skipping publisher since build result is FAILURE
[TASKS] Skipping publisher since build result is FAILURE
Archiving artifacts
[Fast Archiver] Compressed 153.73 MB of artifacts by 89.2% relative to #3524
Recording test results
Publishing Javadoc
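
Note on the failure: the fatal error comes from the Ant `<copy>` task that maven-antrun-plugin's `copy` execution runs. Ant's `<copy>` accepts exactly one destination attribute, either `tofile` (copy and rename to a single file) or `todir` (copy into a directory), and aborts when both are set. The log does not show the offending `pom.xml`, so the sketch below is only an illustration of the shape of the problem and its fix; the file and directory names are hypothetical:

```xml
<!-- Broken (hypothetical): both destination attributes are set,
     which Ant rejects with "Only one of tofile and todir may be set." -->
<copy file="target/mahout-spark_2.10-${project.version}.jar"
      tofile="${project.build.directory}/lib/mahout-spark.jar"
      todir="${project.build.directory}/lib"/>

<!-- Fixed: keep exactly one destination. Use tofile to copy-and-rename... -->
<copy file="target/mahout-spark_2.10-${project.version}.jar"
      tofile="${project.build.directory}/lib/mahout-spark.jar"/>

<!-- ...or todir to copy under the original name. -->
<copy file="target/mahout-spark_2.10-${project.version}.jar"
      todir="${project.build.directory}/lib"/>
```

This fits the commit messages above ("fix antrun copy issue on h2o", "distribution was looking for jar named with classifier"): a copy step that renames a classifier-suffixed jar needs `tofile` alone, while a plain relocation needs `todir` alone.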