You are receiving this mail because a port that you maintain
is failing to build on the FreeBSD package build server.
Please investigate the failure and submit a PR to fix
the build.

Maintainer:     de...@freebsd.org
Last committer: de...@freebsd.org
Ident:          $FreeBSD: branches/2015Q1/devel/spark/Makefile 375075 2014-12-20 18:26:31Z demon $
Log URL:        http://beefy1.isc.freebsd.org/data/93i386-quarterly/2015-03-11_06h33m23s/logs/apache-spark-1.2.0.log
Build URL:      http://beefy1.isc.freebsd.org/build.html?mastername=93i386-quarterly&build=2015-03-11_06h33m23s
Log:

====>> Building devel/spark
build started at Wed Mar 11 06:52:26 UTC 2015
port directory: /usr/ports/devel/spark
building for: FreeBSD 93i386-quarterly-job-12 9.3-RELEASE-p10 FreeBSD 9.3-RELEASE-p10 i386
maintained by: de...@freebsd.org
Makefile ident:      $FreeBSD: branches/2015Q1/devel/spark/Makefile 375075 2014-12-20 18:26:31Z demon $
Poudriere version: 3.1.1
Host OSVERSION: 1100060
Jail OSVERSION: 903000

---Begin Environment---
UNAME_m=i386
UNAME_p=i386
OSVERSION=903000
UNAME_v=FreeBSD 9.3-RELEASE-p10
UNAME_r=9.3-RELEASE-p10
BLOCKSIZE=K
MAIL=/var/mail/root
STATUS=1
SAVED_TERM=
MASTERMNT=/usr/local/poudriere/data/.m/93i386-quarterly/ref
PATH=/sbin:/bin:/usr/sbin:/usr/bin:/usr/games:/usr/local/sbin:/usr/local/bin:/root/bin
POUDRIERE_BUILD_TYPE=bulk
PKGNAME=apache-spark-1.2.0
OLDPWD=/
PWD=/usr/local/poudriere/data/.m/93i386-quarterly/ref/.p/pool
MASTERNAME=93i386-quarterly
USER=root
HOME=/root
POUDRIERE_VERSION=3.1.1
LOCALBASE=/usr/local
PACKAGE_BUILDING=yes
---End Environment---

---Begin OPTIONS List---
---End OPTIONS List---

--CONFIGURE_ARGS--

--End CONFIGURE_ARGS--

--CONFIGURE_ENV--
XDG_DATA_HOME=/wrkdirs/usr/ports/devel/spark/work  
XDG_CONFIG_HOME=/wrkdirs/usr/ports/devel/spark/work  
HOME=/wrkdirs/usr/ports/devel/spark/work TMPDIR="/tmp" 
PYTHON="/usr/local/bin/python2.7" SHELL=/bin/sh CONFIG_SHELL=/bin/sh
--End CONFIGURE_ENV--

--MAKE_ENV--
MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m" 
XDG_DATA_HOME=/wrkdirs/usr/ports/devel/spark/work  
XDG_CONFIG_HOME=/wrkdirs/usr/ports/devel/spark/work  
HOME=/wrkdirs/usr/ports/devel/spark/work TMPDIR="/tmp" NO_PIE=yes SHELL=/bin/sh 
NO_LINT=YES PREFIX=/usr/local  LOCALBASE=/usr/local  LIBDIR="/usr/lib"  CC="cc" 
CFLAGS="-O2 -pipe -fno-strict-aliasing"  CPP="cpp" CPPFLAGS=""  LDFLAGS="" 
LIBS=""  CXX="c++" CXXFLAGS="-O2 -pipe -fno-strict-aliasing"  
MANPREFIX="/usr/local" BSD_INSTALL_PROGRAM="install  -s -o root -g wheel -m 
555"  BSD_INSTALL_LIB="install  -s -o root -g wheel -m 444"  
BSD_INSTALL_SCRIPT="install  -o root -g wheel -m 555"  
BSD_INSTALL_DATA="install  -o root -g wheel -m 0644"  BSD_INSTALL_MAN="install  
-o root -g wheel -m 444"
--End MAKE_ENV--

--PLIST_SUB--
SPARK_USER=spark
SPARK_GROUP=spark
VER=1.2.0
JAVASHAREDIR="share/java"
JAVAJARDIR="share/java/classes"
PYTHON_INCLUDEDIR=include/python2.7
PYTHON_LIBDIR=lib/python2.7
PYTHON_PLATFORM=freebsd9
PYTHON_SITELIBDIR=lib/python2.7/site-packages
PYTHON_VERSION=python2.7
PYTHON_VER=2.7
OSREL=9.3
PREFIX=%D
LOCALBASE=/usr/local
RESETPREFIX=/usr/local
PORTDOCS=""
PORTEXAMPLES=""
LIB32DIR=lib
DOCSDIR="share/doc/spark"
EXAMPLESDIR="share/examples/spark"
DATADIR="share/spark"
WWWDIR="www/spark"
ETCDIR="etc/spark"
--End PLIST_SUB--

--SUB_LIST--
SPARK_USER=spark
SPARK_GROUP=spark
JAVASHAREDIR="/usr/local/share/java"
JAVAJARDIR="/usr/local/share/java/classes"
JAVALIBDIR="/usr/local/share/java/classes"
JAVA_VERSION="1.7+"
PREFIX=/usr/local
LOCALBASE=/usr/local
DATADIR=/usr/local/share/spark
DOCSDIR=/usr/local/share/doc/spark
EXAMPLESDIR=/usr/local/share/examples/spark
WWWDIR=/usr/local/www/spark
ETCDIR=/usr/local/etc/spark
--End SUB_LIST--

---Begin make.conf---
ARCH=i386
MACHINE=i386
MACHINE_ARCH=i386
USE_PACKAGE_DEPENDS=yes
BATCH=yes
WRKDIRPREFIX=/wrkdirs
PORTSDIR=/usr/ports
PACKAGES=/packages
DISTDIR=/distfiles
#### /usr/local/etc/poudriere.d/make.conf ####
WITH_PKGNG=yes
#WITH_PKGNG=devel
# clean-restricted via poudriere.conf NO_RESTRICTED
#NO_RESTRICTED=yes
DISABLE_MAKE_JOBS=poudriere
---End make.conf---
=======================<phase: check-sanity   >============================
===>  License APACHE20 accepted by the user
===========================================================================
=======================<phase: pkg-depends    >============================
===>   apache-spark-1.2.0 depends on file: /usr/local/sbin/pkg - not found
===>    Verifying install for /usr/local/sbin/pkg in /usr/ports/ports-mgmt/pkg
===>   Installing existing package /packages/All/pkg-1.4.3.txz
[93i386-quarterly-job-12] Installing pkg-1.4.3...
[93i386-quarterly-job-12] Extracting pkg-1.4.3... done
Message for pkg-1.4.3:
 If you are upgrading from the old package format, first run:

  # pkg2ng
===>   Returning to build of apache-spark-1.2.0
===========================================================================
=======================<phase: fetch-depends  >============================
===========================================================================
=======================<phase: fetch          >============================
===>  License APACHE20 accepted by the user
===> Fetching all distfiles required by apache-spark-1.2.0 for building
===========================================================================
=======================<phase: checksum       >============================
===>  License APACHE20 accepted by the user
===> Fetching all distfiles required by apache-spark-1.2.0 for building
=> SHA256 Checksum OK for hadoop/spark-1.2.0.tgz.
=> SHA256 Checksum OK for hadoop/FreeBSD-spark-1.2.0-maven-repository.tar.gz.
===========================================================================
=======================<phase: extract-depends>============================
===========================================================================
=======================<phase: extract        >============================
===>  License APACHE20 accepted by the user
===> Fetching all distfiles required by apache-spark-1.2.0 for building
===>  Extracting for apache-spark-1.2.0
=> SHA256 Checksum OK for hadoop/spark-1.2.0.tgz.
=> SHA256 Checksum OK for hadoop/FreeBSD-spark-1.2.0-maven-repository.tar.gz.
===========================================================================
=======================<phase: patch-depends  >============================
===========================================================================
=======================<phase: patch          >============================
===>  Patching for apache-spark-1.2.0
===>  Applying FreeBSD patches for apache-spark-1.2.0
===========================================================================
=======================<phase: build-depends  >============================
===>   apache-spark-1.2.0 depends on file: /usr/local/share/java/maven3/bin/mvn 
- not found
===>    Verifying install for /usr/local/share/java/maven3/bin/mvn in 
/usr/ports/devel/maven3
===>   Installing existing package /packages/All/maven3-3.0.5.txz
[93i386-quarterly-job-12] Installing maven3-3.0.5...
[93i386-quarterly-job-12] `-- Installing maven-wrapper-1_2...
[93i386-quarterly-job-12] `-- Extracting maven-wrapper-1_2... done
[93i386-quarterly-job-12] `-- Installing openjdk-7.71.14_1,1...
[93i386-quarterly-job-12] |   `-- Installing alsa-lib-1.0.28...
[93i386-quarterly-job-12] |   `-- Extracting alsa-lib-1.0.28... done
[93i386-quarterly-job-12] |   `-- Installing dejavu-2.34_4...
[93i386-quarterly-job-12] |   `-- Extracting dejavu-2.34_4... done
[93i386-quarterly-job-12] |   `-- Installing fontconfig-2.11.1,1...
[93i386-quarterly-job-12] |   | `-- Installing expat-2.1.0_2...
[93i386-quarterly-job-12] |   | `-- Extracting expat-2.1.0_2... done
[93i386-quarterly-job-12] |   | `-- Installing freetype2-2.5.4_1...
[93i386-quarterly-job-12] |   | `-- Extracting freetype2-2.5.4_1... done
[93i386-quarterly-job-12] |   `-- Extracting fontconfig-2.11.1,1... done
Running fc-cache to build fontconfig cache...
/usr/local/share/fonts: skipping, no such directory
/usr/local/lib/X11/fonts: caching, new cache contents: 0 fonts, 1 dirs
/usr/local/lib/X11/fonts/dejavu: caching, new cache contents: 21 fonts, 0 dirs
/root/.local/share/fonts: skipping, no such directory
/root/.fonts: skipping, no such directory
Re-scanning /usr/local/lib/X11/fonts: caching, new cache contents: 0 fonts, 1 
dirs
/var/db/fontconfig: cleaning cache directory
/root/.cache/fontconfig: not cleaning non-existent cache directory
/root/.fontconfig: not cleaning non-existent cache directory
fc-cache: succeeded
[93i386-quarterly-job-12] |   `-- Installing java-zoneinfo-2014.j...
[93i386-quarterly-job-12] |   `-- Extracting java-zoneinfo-2014.j... done
[93i386-quarterly-job-12] |   `-- Installing javavmwrapper-2.5...
[93i386-quarterly-job-12] |   `-- Extracting javavmwrapper-2.5... done
[93i386-quarterly-job-12] |   `-- Installing libX11-1.6.2_2,1...
[93i386-quarterly-job-12] |   | `-- Installing kbproto-1.0.6...
[93i386-quarterly-job-12] |   | `-- Extracting kbproto-1.0.6... done
[93i386-quarterly-job-12] |   | `-- Installing libXau-1.0.8_2...
[93i386-quarterly-job-12] |   |   `-- Installing xproto-7.0.26...
[93i386-quarterly-job-12] |   |   `-- Extracting xproto-7.0.26... done
[93i386-quarterly-job-12] |   | `-- Extracting libXau-1.0.8_2... done
[93i386-quarterly-job-12] |   | `-- Installing libXdmcp-1.1.1_2...
[93i386-quarterly-job-12] |   | `-- Extracting libXdmcp-1.1.1_2... done
[93i386-quarterly-job-12] |   | `-- Installing libxcb-1.11...
[93i386-quarterly-job-12] |   |   `-- Installing libpthread-stubs-0.3_6...
[93i386-quarterly-job-12] |   |   `-- Extracting libpthread-stubs-0.3_6... done
[93i386-quarterly-job-12] |   |   `-- Installing libxml2-2.9.2_2...
[93i386-quarterly-job-12] |   |   | `-- Installing libiconv-1.14_6...
[93i386-quarterly-job-12] |   |   | `-- Extracting libiconv-1.14_6... done
[93i386-quarterly-job-12] |   |   `-- Extracting libxml2-2.9.2_2... done
<snip>
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ spark-core_2.10 ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (enforce-versions) @ 
spark-core_2.10 ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-source (add-scala-sources) @ 
spark-core_2.10 ---
[INFO] Source directory: 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/main/scala added.
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ 
spark-core_2.10 ---
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (default) @ spark-core_2.10 ---
[WARNING] Parameter tasks is deprecated, use target instead
[INFO] Executing tasks

main:
    [unzip] Expanding: 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/python/lib/py4j-0.8.2.1-src.zip 
into /wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/python/build
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ 
spark-core_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 11 resources
[INFO] Copying 23 resources
[INFO] Copying 7 resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.0:compile (scala-compile-first) @ 
spark-core_2.10 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal 
incremental compile
[INFO] Using incremental compilation
[INFO] compiler plugin: 
BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null)
[INFO] Compiling 402 Scala sources and 33 Java sources to 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/target/scala-2.10/classes...
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/main/scala/org/apache/spark/scheduler/TaskResultGetter.scala:50:
 inferred existential type (org.apache.spark.scheduler.DirectTaskResult[_$1], 
Int) forSome { type _$1 }, which cannot be expressed by wildcards,  should be 
enabled
by making the implicit value scala.language.existentials visible.
This can be achieved by adding the import clause 'import 
scala.language.existentials'
or by setting the compiler option -language:existentials.
See the Scala docs for value scala.language.existentials for a discussion
why the feature should be explicitly enabled.
[WARNING]           val (result, size) = 
serializer.get().deserialize[TaskResult[_]](serializedData) match {
[WARNING]               ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/main/scala/org/apache/spark/SparkContext.scala:573:
 constructor Job in class Job is deprecated: see corresponding Javadoc for more 
information.
[WARNING]     val job = new NewHadoopJob(hadoopConfiguration)
[WARNING]               ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/main/scala/org/apache/spark/SparkContext.scala:618:
 constructor Job in class Job is deprecated: see corresponding Javadoc for more 
information.
[WARNING]     val job = new NewHadoopJob(hadoopConfiguration)
[WARNING]               ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/main/scala/org/apache/spark/SparkContext.scala:773:
 constructor Job in class Job is deprecated: see corresponding Javadoc for more 
information.
[WARNING]     val job = new NewHadoopJob(conf)
[WARNING]               ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/main/scala/org/apache/spark/SparkHadoopWriter.scala:168:
 constructor TaskID in class TaskID is deprecated: see corresponding Javadoc 
for more information.
[WARNING]         new TaskAttemptID(new TaskID(jID.value, true, splitID), 
attemptID))
[WARNING]                           ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/main/scala/org/apache/spark/SparkHadoopWriter.scala:189:
 method makeQualified in class Path is deprecated: see corresponding Javadoc 
for more information.
[WARNING]     outputPath.makeQualified(fs)
[WARNING]                ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala:103:
 method isDir in class FileStatus is deprecated: see corresponding Javadoc for 
more information.
[WARNING]     if (!fs.getFileStatus(path).isDir) {
[WARNING]                                 ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala:157:
 method isDir in class FileStatus is deprecated: see corresponding Javadoc for 
more information.
[WARNING]       val logDirs = if (logStatus != null) 
logStatus.filter(_.isDir).toSeq else Seq[FileStatus]()
[WARNING]                                                               ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/main/scala/org/apache/spark/input/PortableDataStream.scala:48:
 method isDir in class FileStatus is deprecated: see corresponding Javadoc for 
more information.
[WARNING]       if (file.isDir) 0L else file.getLen
[WARNING]                ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/main/scala/org/apache/spark/input/WholeTextFileInputFormat.scala:63:
 method isDir in class FileStatus is deprecated: see corresponding Javadoc for 
more information.
[WARNING]       if (file.isDir) 0L else file.getLen
[WARNING]                ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/main/scala/org/apache/spark/mapred/SparkHadoopMapRedUtil.scala:56:
 constructor TaskAttemptID in class TaskAttemptID is deprecated: see 
corresponding Javadoc for more information.
[WARNING]     new TaskAttemptID(jtIdentifier, jobId, isMap, taskId, attemptId)
[WARNING]     ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/main/scala/org/apache/spark/rdd/CheckpointRDD.scala:110:
 method getDefaultReplication in class FileSystem is deprecated: see 
corresponding Javadoc for more information.
[WARNING]       fs.create(tempOutputPath, false, bufferSize, 
fs.getDefaultReplication, blockSize)
[WARNING]                                                       ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/main/scala/org/apache/spark/rdd/HadoopRDD.scala:349:
 constructor TaskID in class TaskID is deprecated: see corresponding Javadoc 
for more information.
[WARNING]     val taId = new TaskAttemptID(new TaskID(jobID, true, splitId), 
attemptId)
[WARNING]                                  ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala:884:
 constructor Job in class Job is deprecated: see corresponding Javadoc for more 
information.
[WARNING]     val job = new NewAPIHadoopJob(hadoopConf)
[WARNING]               ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala:952:
 constructor Job in class Job is deprecated: see corresponding Javadoc for more 
information.
[WARNING]     val job = new NewAPIHadoopJob(hadoopConf)
[WARNING]               ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala:202:
 method isDir in class FileStatus is deprecated: see corresponding Javadoc for 
more information.
[WARNING]           fileStatuses.filter(!_.isDir).map(_.getPath).toSeq
[WARNING]                                  ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/main/scala/org/apache/spark/scheduler/InputFormatInfo.scala:106:
 constructor Job in class Job is deprecated: see corresponding Javadoc for more 
information.
[WARNING]     val job = new Job(conf)
[WARNING]               ^
[WARNING] 17 warnings found
[WARNING] warning: [options] bootstrap class path not set in conjunction with 
-source 1.6
[WARNING] 1 warning
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ 
spark-core_2.10 ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 33 source files to 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/target/scala-2.10/classes
[INFO] 
[INFO] --- build-helper-maven-plugin:1.8:add-test-source 
(add-scala-test-sources) @ spark-core_2.10 ---
[INFO] Test Source directory: 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/test/scala added.
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ 
spark-core_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 4 resources
[INFO] Copying 3 resources
[INFO] 
[INFO] --- scala-maven-plugin:3.2.0:testCompile (scala-test-compile-first) @ 
spark-core_2.10 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal 
incremental compile
[INFO] Using incremental compilation
[INFO] compiler plugin: 
BasicArtifact(org.scalamacros,paradise_2.10.4,2.0.1,null)
[INFO] Compiling 123 Scala sources and 4 Java sources to 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/target/scala-2.10/test-classes...
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/test/scala/org/apache/spark/FileSuite.scala:497:
 constructor Job in class Job is deprecated: see corresponding Javadoc for more 
information.
[WARNING]     val job = new Job(sc.hadoopConfiguration)
[WARNING]               ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/test/scala/org/apache/spark/metrics/InputOutputMetricsSuite.scala:34:
 trait ShouldMatchers in package matchers is deprecated: Please use 
org.scalatest.Matchers instead.
[WARNING] class InputOutputMetricsSuite extends FunSuite with 
SharedSparkContext with ShouldMatchers {
[WARNING]                                                                       
      ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/test/scala/org/apache/spark/scheduler/EventLoggingListenerSuite.scala:177:
 method isDir in class FileStatus is deprecated: see corresponding Javadoc for 
more information.
[WARNING]     assert(logDir.isDir)
[WARNING]                   ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/test/scala/org/apache/spark/scheduler/ReplayListenerSuite.scala:124:
 method isDir in class FileStatus is deprecated: see corresponding Javadoc for 
more information.
[WARNING]     assert(eventLogDir.isDir)
[WARNING]                        ^
[WARNING] 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/test/scala/org/apache/spark/util/FileLoggerSuite.scala:106:
 method isDir in class FileStatus is deprecated: see corresponding Javadoc for 
more information.
[WARNING]     assert(fileSystem.getFileStatus(logDirPath).isDir)
[WARNING]                                                 ^
[WARNING] 5 warnings found
[WARNING] warning: [options] bootstrap class path not set in conjunction with 
-source 1.6
[WARNING] Note: 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/test/java/org/apache/spark/JavaAPISuite.java
 uses or overrides a deprecated API.
[WARNING] Note: Recompile with -Xlint:deprecation for details.
[WARNING] Note: 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/src/test/java/org/apache/spark/JavaAPISuite.java
 uses unchecked or unsafe operations.
[WARNING] Note: Recompile with -Xlint:unchecked for details.
[WARNING] 1 warning
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ 
spark-core_2.10 ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 4 source files to 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/target/scala-2.10/test-classes
[INFO] 
[INFO] --- maven-dependency-plugin:2.9:build-classpath (default) @ 
spark-core_2.10 ---
[INFO] Wrote classpath file 
'/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/target/spark-test-classpath.txt'.
[INFO] 
[INFO] --- gmavenplus-plugin:1.2:execute (default) @ spark-core_2.10 ---
[INFO] Using Groovy 2.3.7 to perform execute.
[INFO] 
[INFO] --- maven-surefire-plugin:2.17:test (default-test) @ spark-core_2.10 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ spark-core_2.10 ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ spark-core_2.10 ---
[INFO] Building jar: 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/target/spark-core_2.10-1.2.0.jar
[INFO] 
[INFO] --- maven-site-plugin:3.3:attach-descriptor (attach-descriptor) @ 
spark-core_2.10 ---
[INFO] 
[INFO] --- maven-dependency-plugin:2.9:copy-dependencies (copy-dependencies) @ 
spark-core_2.10 ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Spark Project Parent POM .......................... SUCCESS [7.016s]
[INFO] Spark Project Networking .......................... SUCCESS [33.314s]
[INFO] Spark Project Shuffle Streaming Service ........... SUCCESS [6.220s]
[INFO] Spark Project Core ................................ FAILURE [4:06.239s]
[INFO] Spark Project Bagel ............................... SKIPPED
[INFO] Spark Project GraphX .............................. SKIPPED
[INFO] Spark Project Streaming ........................... SKIPPED
[INFO] Spark Project Catalyst ............................ SKIPPED
[INFO] Spark Project SQL ................................. SKIPPED
[INFO] Spark Project ML Library .......................... SKIPPED
[INFO] Spark Project Tools ............................... SKIPPED
[INFO] Spark Project Hive ................................ SKIPPED
[INFO] Spark Project REPL ................................ SKIPPED
[INFO] Spark Project YARN Parent POM ..................... SKIPPED
[INFO] Spark Project YARN Stable API ..................... SKIPPED
[INFO] Spark Project Assembly ............................ SKIPPED
[INFO] Spark Project External Twitter .................... SKIPPED
[INFO] Spark Project External Flume Sink ................. SKIPPED
[INFO] Spark Project External Flume ...................... SKIPPED
[INFO] Spark Project External MQTT ....................... SKIPPED
[INFO] Spark Project External ZeroMQ ..................... SKIPPED
[INFO] Spark Project External Kafka ...................... SKIPPED
[INFO] Spark Project Examples ............................ SKIPPED
[INFO] Spark Project YARN Shuffle Service ................ SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 4:53.909s
[INFO] Finished at: Wed Mar 11 06:57:56 GMT 2015
[INFO] Final Memory: 69M/605M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal 
org.apache.maven.plugins:maven-dependency-plugin:2.9:copy-dependencies 
(copy-dependencies) on project spark-core_2.10: Error copying artifact from 
/wrkdirs/usr/ports/devel/spark/work/m2/com/google/guava/guava/14.0.1/guava-14.0.1.jar
 to 
/wrkdirs/usr/ports/devel/spark/work/spark-1.2.0/core/target/jars/guava-14.0.1.jar:
 Map failed -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :spark-core_2.10
*** [do-build] Error code 1

Stop in /usr/ports/devel/spark.
_______________________________________________
freebsd-pkg-fallout@freebsd.org mailing list
https://lists.freebsd.org/mailman/listinfo/freebsd-pkg-fallout
To unsubscribe, send any mail to "freebsd-pkg-fallout-unsubscr...@freebsd.org"