See <https://builds.apache.org/job/Mahout-Quality/3324/changes>

Changes:

[alexey.s.grigoriev] MAHOUT-1570: initial skeleton for Mahout DSL on Apache 
Flink

[alexey.s.grigoriev] MAHOUT-1712: Flink: Ax, At, Atx operators

[alexey.s.grigoriev] MAHOUT-1701: Flink: AtB implemented, ABt and AtA expressed 
via AtB

[alexey.s.grigoriev] MAHOUT-1702: Flink: AewScalar and AewB

[alexey.s.grigoriev] MAHOUT-1703: Flink: cbind, rbind and mapBlock

[alexey.s.grigoriev] MAHOUT-1709: Flink: slicing operator

[alexey.s.grigoriev] MAHOUT-1710: Flink: A times incoreB operator

[alexey.s.grigoriev] MAHOUT-1570: Flink: calculating ncol, nrow; colSum, 
colMean, norm

[alexey.s.grigoriev] MAHOUT-1570: rebased to latest upstream

[alexey.s.grigoriev] NOJIRA: added .cache to gitignore

[alexey.s.grigoriev] MAHOUT-1570: upgraded to flink 0.9-SNAPSHOT for IO

[alexey.s.grigoriev] MAHOUT-1734: Flink: DRM IO

[alexey.s.grigoriev] MAHOUT-1570: Flink: added headers, comments and acknowledgements

[alexey.s.grigoriev] MAHOUT-1711: Flink: drmBroadcast implemented

[alexey.s.grigoriev] MAHOUT-1570: rebased to latest upstream

[alexey.s.grigoriev] MAHOUT-1702: Flink: AewScalar replaced with OpAewUnaryFunc

[alexey.s.grigoriev] MAHOUT-1570: Flink: imports cleaned

[alexey.s.grigoriev] MAHOUT-1570: Flink: nrow and ncol optimized

[alexey.s.grigoriev] MAHOUT-1570: rebased to latest upstream

[alexey.s.grigoriev] MAHOUT-1764: Flink: standard backend tests

[alexey.s.grigoriev] MAHOUT-1751: Flink: AtA slim

[alexey.s.grigoriev] MAHOUT-1703: Flink: cbind with scalar

[alexey.s.grigoriev] MAHOUT-1702: Flink: unary functions

[alexey.s.grigoriev] MAHOUT-1701: Flink: AtB bug fixed

[alexey.s.grigoriev] MAHOUT-1570: Flink: drmParallelizeEmpty and extra tests

[alexey.s.grigoriev] MAHOUT-1570: Flink: numNonZeroElementsPerColumn

[alexey.s.grigoriev] NOJIRA: "spark specific" comment is no longer true - removed it

[alexey.s.grigoriev] MAHOUT-1747: Flink: support for different key types

[alexey.s.grigoriev] MAHOUT-1701: Flink: bug with AtB

[alexey.s.grigoriev] MAHOUT-1751: Flink: fat AtA

[a.grigorev] MAHOUT-1570: fixed maven issue

[dlyubimov] Unifying "keyClassTag" of checkpoints and "classTagK" of logical

[smarthi] 1. Changed poms to next release version - 0.12.0 2. Changed the flink

[smarthi] MAHOUT-1776 Refactor common Engine agnostic classes to Math-Scala 
module

[dlyubimov] A small renaming of methods of DrmRddInput

[smarthi] 1. Reworked FlinkEngine.drmDfsRead 2. small renaming of methods in

[smarthi] Refactored FlinkEngine.drmDfsRead(), closes apache/mahout #165

[smarthi] Removed unused imports

[smarthi] MAHOUT-1775 FileNotFoundException caused by aborting the process of

[smarthi] MAHOUT-1775 FileNotFoundException caused by aborting the process of

[smarthi] WIP, Mahout-Flink Integration, adding missing methods; code 
refactoring

[smarthi] WIP, Flink-Mahout integration, created a decorator; this closes

[smarthi] MAHOUT-1781 Dense matrix view multiplication is 4x slower than 
non-view

[smarthi] MAHOUT-1785: Replace 'spark.kryoserializer.buffer.mb' from Spark 
config

[smarthi] NOJIRA: minor fixes

[smarthi] WIP, migrating to Flink 0.10 and the Flink Scala API

[smarthi] WIP, migrating to Flink 0.10 and the Flink Scala API

[smarthi] NOJira: Add missing license header

[smarthi] MAHOUT-1793: Declare WORK_DIR earlier in example script to fix output

[smarthi] MAHOUT-1797: Typos for SPARK_ASSEMBLY_BIN

[smarthi] MAHOUT-1800: Pare down ClassTag overuse closes apache/mahout#183

[smarthi] NoJIRA: scala code cleanup from the previous commits

[smarthi] (nojira) upgrade to Spark 1.5.2

[smarthi] MAHOUT-1640: Better collections would significantly improve

[smarthi] MAHOUT-1801: FastUtil to improve speed of Sparse Matrix Operations,

[smarthi] NoJira: Add fastutil to Spark reduced-dependency assembly

[smarthi] MAHOUT-1802: Capture attached checkpoints (if cached) closes

[smarthi] (nojira) bump shell version to 0.11.2

[smarthi] NoJira: removed unnecessary references and imports of scala.ClassTag

[smarthi] Merge changes from master

[smarthi] Major fixes for Flink backend merged

[smarthi] MAHOUT-1747: Mahout DSL for Flink: add support for different types of

[apalumbo] MAHOUT-1811: Fix DRM norm formula closes apache/mahout#192

[smarthi] MAHOUT-1812: Implement drmParallelizeEmptyLong(...) in flink Bindings

[apalumbo] MAHOUT-1805: implement allreduceBlock in Flink closes 
apache/mahout#193

[apalumbo] MAHOUT-1805: Fix previous conflict resolution error

[smarthi] NoJira: Fix the compile issue from previous commit

[apalumbo] MAHOUT-1808: Some cleanup of unused operations in sparkbindings 
closes

[apalumbo] MAHOUT-1807: Distributed second norm doesn't take sqrt closes

[apalumbo] Implicit checkpoint must not request caching closes apache/mahout#188

[smarthi] NoJira: minor fixes

[apalumbo] MAHOUT-1813: Functional 'apply' DSL for distributed and in-memory

[smarthi] Update Spark Shell for 0.12.0

[apalumbo] MAHOUT-1815: dsqDist(X,Y) and dsqDist(X) failing in flink tests. 
closes

[akm] MAHOUT-1773: Fix cluster-syntheticcontrol.sh for HDFS syntax closes

[akm] Merge branch 'mahoutworkdir' into MAHOUT-1794; pulling PR into a branch

[akm] Adding instructions for MAHOUT-1794 to the readme.

[akm] MAHOUT-1794 Closing PR closes apache/mahout#178

[apalumbo] MAHOUT-1810: Failing test in flink-bindings: A + B Identically

[smarthi] MAHOUT-1816: Implement newRowCardinality in CheckpointedFlinkDrm, this

[apalumbo] MAHOUT-1809: Bump JVM memory up to 4g for flink scalatests closes

[apalumbo] MAHOUT-1810: Use method taken from FlinkMLTools for 
CheckpointedFlinkDrm

[smarthi] Some minor fixes, this closes apache/mahout#202

[apalumbo] Small change addressing DL's comment on apache/mahout#200, also a 
small

[apalumbo] Persist only if the dataset has not been cached.  Otherwise read 
back in

[apalumbo] add uncache

[apalumbo] use  as the base directory for cached files

[apalumbo] wip: use properties from

[apalumbo] wip: use properties from

[apalumbo] move getMahoutHome()

[apalumbo] MAHOUT-1749 Mahout DSL for Flink: Implement Atx closes 
apache/mahout#204

[apalumbo] Comments, cleanup

[apalumbo] comment out parallelization setting in cache()

[apalumbo] revert commits:  48a8a8208322ed690d5356a4e0cac7667b080bab 

[apalumbo] MAHOUT-1817: optimize caching workaround for Flink,  squashed commit 
of

[smarthi] NoJira: Minor fixes for style, closes apache/mahout#205

[smarthi] MAHOUT-1819: Set the default Parallelism for Flink execution in

[smarthi] MAHOUT-1820: Add a method to generate Tuple<PartitionId, Partition

[smarthi] MAHOUT-1822: Update NOTICE.txt, License.txt to add Apache Flink

[smarthi] MAHOUT-1823: Modify MahoutFlinkTestSuite to implement FlinkTestBase,

[smarthi] NoJira: Upgrade to Flink 1.0.1

[smarthi] NoJira: Remove deprecated DataSetOps.scala

[apalumbo] MAHOUT-1824: Optimize FlinkOpAtA to use upper triangular matrices.

[smarthi] NoJira: Fix missing license header

[smarthi] MAHOUT-1826: Fix wikipedia example URLs, this closes apache/mahout#212

[akm] MAHOUT-1766: Increase default PermGen size for spark-shell

[akm] (nojira) Pulling newer readme into this branch

[smarthi] MAHOUT-1821: Use a mahout-flink-conf.yaml configuration file for 
Mahout

[smarthi] MAHOUT-1814: Implement drm2intKeyed in flink bindings, this closes

[smarthi] NoJira: Remove comment references to Spark and fix the Javadocs

[smarthi] MAHOUT-1828: Change the access of blas method in sparkbindings, this

[apalumbo] MAHOUT-1818 workaround and test cleanup for Flink release closes

[apalumbo] (noJira) copy test data into /examples/bin/resources dir so that it 
is

[smarthi] [maven-release-plugin] prepare release mahout-0.12.0

[smarthi] [maven-release-plugin] prepare for next development iteration

[smarthi] Rolling back 0.12.0 Release Candidate

[apalumbo] (nojira) fix tmp directory for cache. closes apache/mahout#220

[smarthi] MAHOUT-1829: Add Flink module to build tools, this closes

[smarthi] [maven-release-plugin] prepare release mahout-0.12.0

[smarthi] [maven-release-plugin] prepare for next development iteration

[smarthi] Rolling back 0.12.0 Release Candidate 1

------------------------------------------
[...truncated 82343 lines...]
[WARNING] <https://builds.apache.org/job/Mahout-Quality/ws/flink/src/test/scala/org/apache/mahout/flinkbindings/blas/LATestSuite.scala>:202: warning: Type Any has no fields that are visible from Scala Type analysis. Falling back to Java Type Analysis (TypeExtractor).
[WARNING]     val res = FlinkOpAtA.fat(op, Aany)
[WARNING]                                  ^
[WARNING] one warning found
[INFO] prepare-compile in 0 s
[INFO] compile in 17 s
[INFO] 
[INFO] --- maven-compiler-plugin:3.3:testCompile (default-testCompile) @ mahout-flink_2.10 ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:2.18.1:test (default-test) @ mahout-flink_2.10 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ mahout-flink_2.10 ---
Discovery starting.
Discovery completed in 733 milliseconds.
Run starting. Expected test count is: 105
DrmLikeOpsSuite:
Exception encountered when invoking run on a nested suite - requirement failed: MAHOUT_HOME is required to spawn mahout-based flink jobs *** ABORTED ***
  java.lang.IllegalArgumentException: requirement failed: MAHOUT_HOME is required to spawn mahout-based flink jobs
  at scala.Predef$.require(Predef.scala:233)
  at org.apache.mahout.flinkbindings.package$.getMahoutHome(package.scala:111)
  at org.apache.mahout.flinkbindings.FlinkDistributedContext.<init>(FlinkDistributedContext.scala:28)
  at org.apache.mahout.flinkbindings.package$.wrapContext(package.scala:52)
  at org.apache.mahout.flinkbindings.DistributedFlinkSuite$class.initContext(DistributedFlinkSuite.scala:42)
  at org.apache.mahout.flinkbindings.DrmLikeOpsSuite.initContext(DrmLikeOpsSuite.scala:28)
  at org.apache.mahout.flinkbindings.DistributedFlinkSuite$class.beforeAll(DistributedFlinkSuite.scala:71)
  at org.apache.mahout.flinkbindings.DrmLikeOpsSuite.beforeAll(DrmLikeOpsSuite.scala:28)
  at org.scalatest.BeforeAndAfterAllConfigMap$class.run(BeforeAndAfterAllConfigMap.scala:244)
  at org.apache.mahout.flinkbindings.DrmLikeOpsSuite.run(DrmLikeOpsSuite.scala:28)
  ...
LATestSuite:
Exception encountered when invoking run on a nested suite - requirement failed: MAHOUT_HOME is required to spawn mahout-based flink jobs *** ABORTED ***
  java.lang.IllegalArgumentException: requirement failed: MAHOUT_HOME is required to spawn mahout-based flink jobs
  at scala.Predef$.require(Predef.scala:233)
  at org.apache.mahout.flinkbindings.package$.getMahoutHome(package.scala:111)
  at org.apache.mahout.flinkbindings.FlinkDistributedContext.<init>(FlinkDistributedContext.scala:28)
  at org.apache.mahout.flinkbindings.package$.wrapContext(package.scala:52)
  at org.apache.mahout.flinkbindings.DistributedFlinkSuite$class.initContext(DistributedFlinkSuite.scala:42)
  at org.apache.mahout.flinkbindings.blas.LATestSuite.initContext(LATestSuite.scala:31)
  at org.apache.mahout.flinkbindings.DistributedFlinkSuite$class.beforeAll(DistributedFlinkSuite.scala:71)
  at org.apache.mahout.flinkbindings.blas.LATestSuite.beforeAll(LATestSuite.scala:31)
  at org.scalatest.BeforeAndAfterAllConfigMap$class.run(BeforeAndAfterAllConfigMap.scala:244)
  at org.apache.mahout.flinkbindings.blas.LATestSuite.run(LATestSuite.scala:31)
  ...
UseCasesSuite:
Exception encountered when invoking run on a nested suite - requirement failed: MAHOUT_HOME is required to spawn mahout-based flink jobs *** ABORTED ***
  java.lang.IllegalArgumentException: requirement failed: MAHOUT_HOME is required to spawn mahout-based flink jobs
  at scala.Predef$.require(Predef.scala:233)
  at org.apache.mahout.flinkbindings.package$.getMahoutHome(package.scala:111)
  at org.apache.mahout.flinkbindings.FlinkDistributedContext.<init>(FlinkDistributedContext.scala:28)
  at org.apache.mahout.flinkbindings.package$.wrapContext(package.scala:52)
  at org.apache.mahout.flinkbindings.DistributedFlinkSuite$class.initContext(DistributedFlinkSuite.scala:42)
  at org.apache.mahout.flinkbindings.UseCasesSuite.initContext(UseCasesSuite.scala:32)
  at org.apache.mahout.flinkbindings.DistributedFlinkSuite$class.beforeAll(DistributedFlinkSuite.scala:71)
  at org.apache.mahout.flinkbindings.UseCasesSuite.beforeAll(UseCasesSuite.scala:32)
  at org.scalatest.BeforeAndAfterAllConfigMap$class.run(BeforeAndAfterAllConfigMap.scala:244)
  at org.apache.mahout.flinkbindings.UseCasesSuite.run(UseCasesSuite.scala:32)
  ...
FlinkDistributedDecompositionsSuite:
Exception encountered when invoking run on a nested suite - requirement failed: MAHOUT_HOME is required to spawn mahout-based flink jobs *** ABORTED ***
  java.lang.IllegalArgumentException: requirement failed: MAHOUT_HOME is required to spawn mahout-based flink jobs
  at scala.Predef$.require(Predef.scala:233)
  at org.apache.mahout.flinkbindings.package$.getMahoutHome(package.scala:111)
  at org.apache.mahout.flinkbindings.FlinkDistributedContext.<init>(FlinkDistributedContext.scala:28)
  at org.apache.mahout.flinkbindings.package$.wrapContext(package.scala:52)
  at org.apache.mahout.flinkbindings.DistributedFlinkSuite$class.initContext(DistributedFlinkSuite.scala:42)
  at org.apache.mahout.flinkbindings.standard.FlinkDistributedDecompositionsSuite.initContext(FlinkDistributedDecompositionsSuite.scala:38)
  at org.apache.mahout.flinkbindings.DistributedFlinkSuite$class.beforeAll(DistributedFlinkSuite.scala:71)
  at org.apache.mahout.flinkbindings.standard.FlinkDistributedDecompositionsSuite.beforeAll(FlinkDistributedDecompositionsSuite.scala:38)
  at org.scalatest.BeforeAndAfterAllConfigMap$class.run(BeforeAndAfterAllConfigMap.scala:244)
  at org.apache.mahout.flinkbindings.standard.FlinkDistributedDecompositionsSuite.run(FlinkDistributedDecompositionsSuite.scala:38)
  ...
RLikeOpsSuite:
Exception encountered when invoking run on a nested suite - requirement failed: MAHOUT_HOME is required to spawn mahout-based flink jobs *** ABORTED ***
  java.lang.IllegalArgumentException: requirement failed: MAHOUT_HOME is required to spawn mahout-based flink jobs
  at scala.Predef$.require(Predef.scala:233)
  at org.apache.mahout.flinkbindings.package$.getMahoutHome(package.scala:111)
  at org.apache.mahout.flinkbindings.FlinkDistributedContext.<init>(FlinkDistributedContext.scala:28)
  at org.apache.mahout.flinkbindings.package$.wrapContext(package.scala:52)
  at org.apache.mahout.flinkbindings.DistributedFlinkSuite$class.initContext(DistributedFlinkSuite.scala:42)
  at org.apache.mahout.flinkbindings.RLikeOpsSuite.initContext(RLikeOpsSuite.scala:34)
  at org.apache.mahout.flinkbindings.DistributedFlinkSuite$class.beforeAll(DistributedFlinkSuite.scala:71)
  at org.apache.mahout.flinkbindings.RLikeOpsSuite.beforeAll(RLikeOpsSuite.scala:34)
  at org.scalatest.BeforeAndAfterAllConfigMap$class.run(BeforeAndAfterAllConfigMap.scala:244)
  at org.apache.mahout.flinkbindings.RLikeOpsSuite.run(RLikeOpsSuite.scala:34)
  ...
FlinkByteBCastSuite:
- BCast vector
- BCast matrix
NaiveBayesTestSuite:
Exception encountered when invoking run on a nested suite - requirement failed: MAHOUT_HOME is required to spawn mahout-based flink jobs *** ABORTED ***
  java.lang.IllegalArgumentException: requirement failed: MAHOUT_HOME is required to spawn mahout-based flink jobs
  at scala.Predef$.require(Predef.scala:233)
  at org.apache.mahout.flinkbindings.package$.getMahoutHome(package.scala:111)
  at org.apache.mahout.flinkbindings.FlinkDistributedContext.<init>(FlinkDistributedContext.scala:28)
  at org.apache.mahout.flinkbindings.package$.wrapContext(package.scala:52)
  at org.apache.mahout.flinkbindings.DistributedFlinkSuite$class.initContext(DistributedFlinkSuite.scala:42)
  at org.apache.mahout.flinkbindings.standard.NaiveBayesTestSuite.initContext(NaiveBayesTestSuite.scala:26)
  at org.apache.mahout.flinkbindings.DistributedFlinkSuite$class.beforeAll(DistributedFlinkSuite.scala:71)
  at org.apache.mahout.flinkbindings.standard.NaiveBayesTestSuite.beforeAll(NaiveBayesTestSuite.scala:26)
  at org.scalatest.BeforeAndAfterAllConfigMap$class.run(BeforeAndAfterAllConfigMap.scala:244)
  at org.apache.mahout.flinkbindings.standard.NaiveBayesTestSuite.run(NaiveBayesTestSuite.scala:26)
  ...
DrmLikeSuite:
Exception encountered when invoking run on a nested suite - requirement failed: MAHOUT_HOME is required to spawn mahout-based flink jobs *** ABORTED ***
  java.lang.IllegalArgumentException: requirement failed: MAHOUT_HOME is required to spawn mahout-based flink jobs
  at scala.Predef$.require(Predef.scala:233)
  at org.apache.mahout.flinkbindings.package$.getMahoutHome(package.scala:111)
  at org.apache.mahout.flinkbindings.FlinkDistributedContext.<init>(FlinkDistributedContext.scala:28)
  at org.apache.mahout.flinkbindings.package$.wrapContext(package.scala:52)
  at org.apache.mahout.flinkbindings.DistributedFlinkSuite$class.initContext(DistributedFlinkSuite.scala:42)
  at org.apache.mahout.flinkbindings.standard.DrmLikeSuite.initContext(DrmLikeSuite.scala:25)
  at org.apache.mahout.flinkbindings.DistributedFlinkSuite$class.beforeAll(DistributedFlinkSuite.scala:71)
  at org.apache.mahout.flinkbindings.standard.DrmLikeSuite.beforeAll(DrmLikeSuite.scala:25)
  at org.scalatest.BeforeAndAfterAllConfigMap$class.run(BeforeAndAfterAllConfigMap.scala:244)
  at org.apache.mahout.flinkbindings.standard.DrmLikeSuite.run(DrmLikeSuite.scala:25)
  ...
RLikeDrmOpsSuite:
Exception encountered when invoking run on a nested suite - requirement failed: MAHOUT_HOME is required to spawn mahout-based flink jobs *** ABORTED ***
  java.lang.IllegalArgumentException: requirement failed: MAHOUT_HOME is required to spawn mahout-based flink jobs
  at scala.Predef$.require(Predef.scala:233)
  at org.apache.mahout.flinkbindings.package$.getMahoutHome(package.scala:111)
  at org.apache.mahout.flinkbindings.FlinkDistributedContext.<init>(FlinkDistributedContext.scala:28)
  at org.apache.mahout.flinkbindings.package$.wrapContext(package.scala:52)
  at org.apache.mahout.flinkbindings.DistributedFlinkSuite$class.initContext(DistributedFlinkSuite.scala:42)
  at org.apache.mahout.flinkbindings.standard.RLikeDrmOpsSuite.initContext(RLikeDrmOpsSuite.scala:25)
  at org.apache.mahout.flinkbindings.DistributedFlinkSuite$class.beforeAll(DistributedFlinkSuite.scala:71)
  at org.apache.mahout.flinkbindings.standard.RLikeDrmOpsSuite.beforeAll(RLikeDrmOpsSuite.scala:25)
  at org.scalatest.BeforeAndAfterAllConfigMap$class.run(BeforeAndAfterAllConfigMap.scala:244)
  at org.apache.mahout.flinkbindings.standard.RLikeDrmOpsSuite.run(RLikeDrmOpsSuite.scala:25)
  ...
DrmLikeOpsSuite:
Exception encountered when invoking run on a nested suite - requirement failed: MAHOUT_HOME is required to spawn mahout-based flink jobs *** ABORTED ***
  java.lang.IllegalArgumentException: requirement failed: MAHOUT_HOME is required to spawn mahout-based flink jobs
  at scala.Predef$.require(Predef.scala:233)
  at org.apache.mahout.flinkbindings.package$.getMahoutHome(package.scala:111)
  at org.apache.mahout.flinkbindings.FlinkDistributedContext.<init>(FlinkDistributedContext.scala:28)
  at org.apache.mahout.flinkbindings.package$.wrapContext(package.scala:52)
  at org.apache.mahout.flinkbindings.DistributedFlinkSuite$class.initContext(DistributedFlinkSuite.scala:42)
  at org.apache.mahout.flinkbindings.standard.DrmLikeOpsSuite.initContext(DrmLikeOpsSuite.scala:25)
  at org.apache.mahout.flinkbindings.DistributedFlinkSuite$class.beforeAll(DistributedFlinkSuite.scala:71)
  at org.apache.mahout.flinkbindings.standard.DrmLikeOpsSuite.beforeAll(DrmLikeOpsSuite.scala:25)
  at org.scalatest.BeforeAndAfterAllConfigMap$class.run(BeforeAndAfterAllConfigMap.scala:244)
  at org.apache.mahout.flinkbindings.standard.DrmLikeOpsSuite.run(DrmLikeOpsSuite.scala:25)
  ...
Run completed in 3 seconds, 443 milliseconds.
Total number of tests run: 2
Suites: completed 2, aborted 9
Tests: succeeded 2, failed 0, canceled 0, ignored 0, pending 0
*** 9 SUITES ABORTED ***
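Every aborted suite fails the same precondition in `getMahoutHome` (package.scala:111) before any test runs, so a plausible local workaround is simply to export `MAHOUT_HOME` on the build node before the Flink suites start. A minimal sketch, assuming a hypothetical checkout path (not one taken from this log):

```shell
# Point MAHOUT_HOME at your Mahout checkout (hypothetical path -- adjust).
export MAHOUT_HOME="$HOME/src/mahout"

# The suites' require() only needs the variable to be set and non-empty:
[ -n "$MAHOUT_HOME" ] && echo "MAHOUT_HOME is set: $MAHOUT_HOME"
```

With the variable in place, the reactor can be resumed from the failing module with `mvn <goals> -rf :mahout-flink_2.10`, as the Maven output at the end of this log suggests.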
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Mahout
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Mahout Build Tools
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Mahout Build Tools ................................. SUCCESS [ 11.338 s]
[INFO] Apache Mahout ...................................... SUCCESS [  0.334 s]
[INFO] Mahout Math ........................................ SUCCESS [01:16 min]
[INFO] Mahout HDFS ........................................ SUCCESS [  9.067 s]
[INFO] Mahout Map-Reduce .................................. SUCCESS [12:03 min]
[INFO] Mahout Integration ................................. SUCCESS [ 54.259 s]
[INFO] Mahout Examples .................................... SUCCESS [ 21.779 s]
[INFO] Mahout Math Scala bindings ......................... SUCCESS [04:18 min]
[INFO] Mahout H2O backend ................................. SUCCESS [03:23 min]
[INFO] Mahout Spark bindings .............................. SUCCESS [02:16 min]
[INFO] Mahout Flink bindings .............................. FAILURE [ 53.221 s]
[INFO] Mahout Spark bindings shell ........................ SKIPPED
[INFO] Mahout Release Package ............................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 25:52 min
[INFO] Finished at: 2016-04-11T08:40:42+00:00
[INFO] Final Memory: 70M/429M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (test) on project mahout-flink_2.10: There are test failures -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :mahout-flink_2.10
Build step 'Invoke top-level Maven targets' marked build as failure
[PMD] Skipping publisher since build result is FAILURE
[TASKS] Skipping publisher since build result is FAILURE
Archiving artifacts
Compressed 169.75 MB of artifacts by 88.4% relative to #3323
Recording test results
Publishing Javadoc
Updating MAHOUT-1749
Updating MAHOUT-1826
Updating MAHOUT-1747
Updating MAHOUT-1766
Updating MAHOUT-1824
Updating MAHOUT-1823
Updating MAHOUT-1822
Updating MAHOUT-1821
Updating MAHOUT-1820
Updating MAHOUT-1703
Updating MAHOUT-1702
Updating MAHOUT-1764
Updating MAHOUT-1701
Updating MAHOUT-1570
Updating MAHOUT-1640
Updating MAHOUT-1793
Updating MAHOUT-1800
Updating MAHOUT-1801
Updating MAHOUT-1819
Updating MAHOUT-1797
Updating MAHOUT-1802
Updating MAHOUT-1817
Updating MAHOUT-1818
Updating MAHOUT-1794
Updating MAHOUT-1805
Updating MAHOUT-1809
Updating MAHOUT-1776
Updating MAHOUT-1808
Updating MAHOUT-1807
Updating MAHOUT-1751
Updating MAHOUT-1734
Updating MAHOUT-1711
Updating MAHOUT-1773
Updating MAHOUT-1710
Updating MAHOUT-1712
Updating MAHOUT-1775
Updating MAHOUT-1709
Updating MAHOUT-1811
Updating MAHOUT-1812
Updating MAHOUT-1810
Updating MAHOUT-1781
Updating MAHOUT-1828
Updating MAHOUT-1815
Updating MAHOUT-1829
Updating MAHOUT-1816
Updating MAHOUT-1813
Updating MAHOUT-1814
Updating MAHOUT-1785