See <https://builds.apache.org/job/Mahout-Quality/3013/changes>

Changes:

[sslavic] MAHOUT-1590 Fixed class path issues; the hbase client in the integration module was bringing in hadoop 2.2 dependencies regardless of the hadoop.version used in the build

------------------------------------------
[...truncated 59413 lines...]
Downloaded: http://repo.maven.apache.org/maven2/org/apache/spark/spark-core_2.10/1.1.1/spark-core_2.10-1.1.1.jar (6375 KB at 14754.6 KB/sec)
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ mahout-spark_2.10 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-source) @ mahout-spark_2.10 ---
[INFO] Source directory: <https://builds.apache.org/job/Mahout-Quality/ws/spark/target/generated-sources/mahout> added.
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-test-source (add-test-source) @ mahout-spark_2.10 ---
[INFO] Test Source directory: <https://builds.apache.org/job/Mahout-Quality/ws/spark/target/generated-test-sources/mahout> added.
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ mahout-spark_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory <https://builds.apache.org/job/Mahout-Quality/ws/spark/src/main/resources>
[INFO] 
[INFO] --- maven-compiler-plugin:3.2:compile (default-compile) @ mahout-spark_2.10 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-scala-plugin:2.15.2:compile (default) @ mahout-spark_2.10 ---
[INFO] Checking for multiple versions of scala
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.3.6 requires scala version: 2.10.3
[WARNING] Multiple versions of scala libraries detected!
[INFO] includes = [**/*.scala,**/*.java,]
[INFO] excludes = []
[INFO] <https://builds.apache.org/job/Mahout-Quality/ws/spark/src/main/scala>:-1: info: compiling
[INFO] Compiling 37 source files to <https://builds.apache.org/job/Mahout-Quality/3013/artifact/spark/target/classes> at 1427371890236
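The "Multiple versions of scala libraries detected" warning above means com.twitter:chill_2.10:0.3.6 declares a dependency on scala-library 2.10.3 while the build targets 2.10.4. A minimal sketch of one way to silence it, as an explicit exclusion in the Spark module's POM (the placement and surrounding declaration are illustrative assumptions, not taken from Mahout's actual POM — only the coordinates come from the warning itself):

```xml
<!-- Illustrative sketch: exclude the older scala-library that chill_2.10
     pulls in transitively, so the 2.10.4 version chosen by the build is
     the only one on the class path. -->
<dependency>
  <groupId>com.twitter</groupId>
  <artifactId>chill_2.10</artifactId>
  <version>0.3.6</version>
  <exclusions>
    <exclusion>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```

Since chill_2.10 only requires a 2.10.x binary-compatible scala-library, letting the build's own 2.10.4 satisfy it is safe within the same minor series.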
[WARNING] <https://builds.apache.org/job/Mahout-Quality/ws/spark/src/main/scala/org/apache/mahout/drivers/TextDelimitedReaderWriter.scala>:162: warning: a pure expression does nothing in statement position; you may be omitting necessary parentheses
[INFO]       val columnIDs = interactions.flatMap { case (_, columns) => columns
[INFO]                                                                   ^
[WARNING] warning: there were 5 deprecation warning(s); re-run with -deprecation for details
[WARNING] warning: there were 16 feature warning(s); re-run with -feature for details
[WARNING] three warnings found
[INFO] prepare-compile in 0 s
[INFO] compile in 22 s
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ mahout-spark_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory <https://builds.apache.org/job/Mahout-Quality/ws/spark/src/test/resources>
[INFO] 
[INFO] --- maven-compiler-plugin:3.2:testCompile (default-testCompile) @ mahout-spark_2.10 ---
[INFO] No sources to compile
[INFO] 
[INFO] --- maven-scala-plugin:2.15.2:testCompile (default) @ mahout-spark_2.10 ---
[INFO] Checking for multiple versions of scala
[WARNING]  Expected all dependencies to require Scala version: 2.10.4
[WARNING]  com.twitter:chill_2.10:0.3.6 requires scala version: 2.10.3
[WARNING] Multiple versions of scala libraries detected!
[INFO] includes = [**/*.scala,**/*.java,]
[INFO] excludes = []
[INFO] <https://builds.apache.org/job/Mahout-Quality/ws/spark/src/test/scala>:-1: info: compiling
[INFO] Compiling 14 source files to <https://builds.apache.org/job/Mahout-Quality/3013/artifact/spark/target/test-classes> at 1427371912566
[INFO] prepare-compile in 0 s
[INFO] compile in 19 s
[INFO] 
[INFO] --- maven-surefire-plugin:2.16:test (default-test) @ mahout-spark_2.10 ---
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0-M2:test (test) @ mahout-spark_2.10 ---
WARNING: -p has been deprecated and will be reused for a different (but still very cool) purpose in ScalaTest 2.0. Please change all uses of -p to -R.
Discovery starting.
Discovery completed in 601 milliseconds.
Run starting. Expected test count is: 92
DiscoverySuite:
RowSimilarityDriverSuite:
10 [SparkListenerBus] ERROR org.apache.spark.util.Utils  - Uncaught exception in thread SparkListenerBus
java.lang.NoSuchMethodError: com.google.common.hash.HashFunction.hashInt(I)Lcom/google/common/hash/HashCode;
        at org.apache.spark.util.collection.OpenHashSet.org$apache$spark$util$collection$OpenHashSet$$hashcode(OpenHashSet.scala:261)
        at org.apache.spark.util.collection.OpenHashSet$mcI$sp.addWithoutResize$mcI$sp(OpenHashSet.scala:124)
        at org.apache.spark.util.collection.OpenHashSet$mcI$sp.add$mcI$sp(OpenHashSet.scala:109)
        at org.apache.spark.ui.jobs.JobProgressListener.onTaskEnd(JobProgressListener.scala:164)
        at org.apache.spark.scheduler.SparkListenerBus$$anonfun$postToAll$7.apply(SparkListenerBus.scala:58)
        at org.apache.spark.scheduler.SparkListenerBus$$anonfun$postToAll$7.apply(SparkListenerBus.scala:58)
        at org.apache.spark.scheduler.SparkListenerBus$$anonfun$foreachListener$1.apply(SparkListenerBus.scala:83)
        at org.apache.spark.scheduler.SparkListenerBus$$anonfun$foreachListener$1.apply(SparkListenerBus.scala:81)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.scheduler.SparkListenerBus$class.foreachListener(SparkListenerBus.scala:81)
        at org.apache.spark.scheduler.SparkListenerBus$class.postToAll(SparkListenerBus.scala:58)
        at org.apache.spark.scheduler.LiveListenerBus.postToAll(LiveListenerBus.scala:32)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:56)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:56)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:56)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1364)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:46)
Exception in thread "SparkListenerBus" java.lang.NoSuchMethodError: com.google.common.hash.HashFunction.hashInt(I)Lcom/google/common/hash/HashCode;
        at org.apache.spark.util.collection.OpenHashSet.org$apache$spark$util$collection$OpenHashSet$$hashcode(OpenHashSet.scala:261)
        at org.apache.spark.util.collection.OpenHashSet$mcI$sp.addWithoutResize$mcI$sp(OpenHashSet.scala:124)
        at org.apache.spark.util.collection.OpenHashSet$mcI$sp.add$mcI$sp(OpenHashSet.scala:109)
        at org.apache.spark.ui.jobs.JobProgressListener.onTaskEnd(JobProgressListener.scala:164)
        at org.apache.spark.scheduler.SparkListenerBus$$anonfun$postToAll$7.apply(SparkListenerBus.scala:58)
        at org.apache.spark.scheduler.SparkListenerBus$$anonfun$postToAll$7.apply(SparkListenerBus.scala:58)
        at org.apache.spark.scheduler.SparkListenerBus$$anonfun$foreachListener$1.apply(SparkListenerBus.scala:83)
        at org.apache.spark.scheduler.SparkListenerBus$$anonfun$foreachListener$1.apply(SparkListenerBus.scala:81)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.scheduler.SparkListenerBus$class.foreachListener(SparkListenerBus.scala:81)
        at org.apache.spark.scheduler.SparkListenerBus$class.postToAll(SparkListenerBus.scala:58)
        at org.apache.spark.scheduler.LiveListenerBus.postToAll(LiveListenerBus.scala:32)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:56)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:56)
        at scala.Option.foreach(Option.scala:236)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:56)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
        at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1364)
        at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:46)
*** RUN ABORTED ***
  java.lang.NoSuchMethodError: com.google.common.hash.HashFunction.hashInt(I)Lcom/google/common/hash/HashCode;
  at org.apache.spark.util.collection.OpenHashSet.org$apache$spark$util$collection$OpenHashSet$$hashcode(OpenHashSet.scala:261)
  at org.apache.spark.util.collection.OpenHashSet$mcI$sp.getPos$mcI$sp(OpenHashSet.scala:165)
  at org.apache.spark.util.collection.OpenHashSet$mcI$sp.contains$mcI$sp(OpenHashSet.scala:102)
  at org.apache.spark.util.SizeEstimator$$anonfun$visitArray$2.apply$mcVI$sp(SizeEstimator.scala:214)
  at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
  at org.apache.spark.util.SizeEstimator$.visitArray(SizeEstimator.scala:210)
  at org.apache.spark.util.SizeEstimator$.visitSingleObject(SizeEstimator.scala:169)
  at org.apache.spark.util.SizeEstimator$.org$apache$spark$util$SizeEstimator$$estimate(SizeEstimator.scala:161)
  at org.apache.spark.util.SizeEstimator$.estimate(SizeEstimator.scala:155)
  at org.apache.spark.util.collection.SizeTracker$class.takeSample(SizeTracker.scala:78)
  ...
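The NoSuchMethodError on com.google.common.hash.HashFunction.hashInt that aborts the run is the classic signature of a Guava version conflict: hashInt(int) only exists from Guava 12 onward, and Spark 1.1.x builds against a newer Guava, so the error means an older Guava (Hadoop 2.x ships 11.0.2, which lacks the method) won the class-path race at test time. A minimal sketch of one common remedy, pinning Guava via dependencyManagement so the Spark-compatible version wins over whatever Hadoop drags in (the version shown, 14.0.1, is an assumption matching Spark 1.1.x, not taken from the actual Mahout POM):

```xml
<!-- Hypothetical sketch: force a single Guava version across the build.
     14.0.1 is assumed here as the version Spark 1.1.x was built against;
     verify against the Spark POM before adopting. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
      <version>14.0.1</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

To confirm which module is pulling the old Guava before pinning, `mvn dependency:tree -Dincludes=com.google.guava` on the failing module shows every path by which Guava enters the class path.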
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Mahout Build Tools ................................ SUCCESS [9.466s]
[INFO] Apache Mahout ..................................... SUCCESS [5.625s]
[INFO] Mahout Math ....................................... SUCCESS [2:18.601s]
[INFO] Mahout MapReduce Legacy ........................... SUCCESS [15:25.198s]
[INFO] Mahout Integration ................................ SUCCESS [1:39.650s]
[INFO] Mahout Examples ................................... SUCCESS [55.838s]
[INFO] Mahout Release Package ............................ SUCCESS [0.104s]
[INFO] Mahout Math Scala bindings ........................ SUCCESS [2:13.649s]
[INFO] Mahout Spark bindings ............................. FAILURE [54.411s]
[INFO] Mahout Spark bindings shell ....................... SKIPPED
[INFO] Mahout H2O backend ................................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 23:46.870s
[INFO] Finished at: Thu Mar 26 12:12:17 UTC 2015
[INFO] Final Memory: 90M/424M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0-M2:test (test) on project mahout-spark_2.10: There are test failures -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :mahout-spark_2.10
Build step 'Invoke top-level Maven targets' marked build as failure
[PMD] Skipping publisher since build result is FAILURE
[TASKS] Skipping publisher since build result is FAILURE
Archiving artifacts
Sending artifact delta relative to Mahout-Quality #3004
Archived 52 artifacts
Archive block size is 32768
Received 1144 blocks and 80834554 bytes
Compression is 31.7%
Took 28 sec
Recording test results
Publishing Javadoc
Updating MAHOUT-1590
