Fixed! After adding the option -DskipTests everything built OK.
Thanks, Sean, for your help.
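
For reference, this is the full command that worked for me (the same make-distribution invocation as quoted below, just with -DskipTests appended; your profiles and Hadoop version may of course differ):

./dev/make-distribution.sh --tgz -Pyarn -Phadoop-2.7 -Dhadoop.version=2.7.0-mapr-1602 -DskipTests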

On Thu, Aug 4, 2016 at 8:18 PM, Richard Siebeling <rsiebel...@gmail.com>
wrote:

> I don't see any other errors; these are the last lines of the
> make-distribution log. Above these lines there are no errors...
>
>
> [INFO] Building jar: /opt/mapr/spark/spark-2.0.0/common/network-yarn/target/spark-network-yarn_2.11-2.0.0-test-sources.jar
> [warn] /opt/mapr/spark/spark-2.0.0/core/src/main/scala/org/apache/spark/api/python/PythonRDD.scala:78: class Accumulator in package spark is deprecated: use AccumulatorV2
> [warn]     accumulator: Accumulator[JList[Array[Byte]]])
> [warn]                  ^
> [warn] /opt/mapr/spark/spark-2.0.0/core/src/main/scala/org/apache/spark/api/python/PythonRDD.scala:71: class Accumulator in package spark is deprecated: use AccumulatorV2
> [warn] private[spark] case class PythonFunction(
> [warn]                           ^
> [warn] /opt/mapr/spark/spark-2.0.0/core/src/main/scala/org/apache/spark/api/python/PythonRDD.scala:873: trait AccumulatorParam in package spark is deprecated: use AccumulatorV2
> [warn]   extends AccumulatorParam[JList[Array[Byte]]] {
> [warn]           ^
> [warn] /opt/mapr/spark/spark-2.0.0/core/src/main/scala/org/apache/spark/util/AccumulatorV2.scala:459: trait AccumulableParam in package spark is deprecated: use AccumulatorV2
> [warn]     param: org.apache.spark.AccumulableParam[R, T]) extends AccumulatorV2[T, R] {
> [warn]                             ^
> [warn] four warnings found
> [error] warning: [options] bootstrap class path not set in conjunction with -source 1.7
> [error] Compile failed at Aug 3, 2016 2:13:07 AM [1:12.769s]
> [INFO] ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO]
> [INFO] Spark Project Parent POM ........................... SUCCESS [  3.850 s]
> [INFO] Spark Project Tags ................................. SUCCESS [  6.053 s]
> [INFO] Spark Project Sketch ............................... SUCCESS [  9.977 s]
> [INFO] Spark Project Networking ........................... SUCCESS [ 17.696 s]
> [INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  8.864 s]
> [INFO] Spark Project Unsafe ............................... SUCCESS [ 17.485 s]
> [INFO] Spark Project Launcher ............................. SUCCESS [ 19.551 s]
> [INFO] Spark Project Core ................................. FAILURE [01:19 min]
> [INFO] Spark Project GraphX ............................... SKIPPED
> [INFO] Spark Project Streaming ............................ SKIPPED
> [INFO] Spark Project Catalyst ............................. SKIPPED
> [INFO] Spark Project SQL .................................. SKIPPED
> [INFO] Spark Project ML Local Library ..................... SUCCESS [ 19.594 s]
> [INFO] Spark Project ML Library ........................... SKIPPED
> [INFO] Spark Project Tools ................................ SUCCESS [  6.972 s]
> [INFO] Spark Project Hive ................................. SKIPPED
> [INFO] Spark Project REPL ................................. SKIPPED
> [INFO] Spark Project YARN Shuffle Service ................. SUCCESS [ 12.019 s]
> [INFO] Spark Project YARN ................................. SKIPPED
> [INFO] Spark Project Assembly ............................. SKIPPED
> [INFO] Spark Project External Flume Sink .................. SUCCESS [ 13.460 s]
> [INFO] Spark Project External Flume ....................... SKIPPED
> [INFO] Spark Project External Flume Assembly .............. SKIPPED
> [INFO] Spark Integration for Kafka 0.8 .................... SKIPPED
> [INFO] Spark Project Examples ............................. SKIPPED
> [INFO] Spark Project External Kafka Assembly .............. SKIPPED
> [INFO] Spark Integration for Kafka 0.10 ................... SKIPPED
> [INFO] Spark Integration for Kafka 0.10 Assembly .......... SKIPPED
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 02:08 min (Wall Clock)
> [INFO] Finished at: 2016-08-03T02:13:07+02:00
> [INFO] Final Memory: 54M/844M
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile (scala-compile-first) on project spark-core_2.11: Execution scala-compile-first of goal net.alchim31.maven:scala-maven-plugin:3.2.2:compile failed. CompileFailed -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions, please read the following articles:
> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginExecutionException
> [ERROR]
> [ERROR] After correcting the problems, you can resume the build with the command
> [ERROR]   mvn <goals> -rf :spark-core_2.11
>
> On Thu, Aug 4, 2016 at 6:30 PM, Sean Owen <so...@cloudera.com> wrote:
>
>> That message is a warning, not an error. It appears just because you're
>> cross-compiling with Java 8. If something failed, it was elsewhere.
>>
>>
>> On Thu, Aug 4, 2016, 07:09 Richard Siebeling <rsiebel...@gmail.com>
>> wrote:
>>
>>> Hi,
>>>
>>> Spark 2.0 with the MapR Hadoop libraries was successfully built using the
>>> following command:
>>> ./build/mvn -Pyarn -Phadoop-2.7 -Dhadoop.version=2.7.0-mapr-1602 -DskipTests clean package
>>>
>>> However, when I then try to build a runnable distribution using the
>>> following command:
>>> ./dev/make-distribution.sh --tgz -Pyarn -Phadoop-2.7 -Dhadoop.version=2.7.0-mapr-1602
>>>
>>> it fails with the error "bootstrap class path not set in conjunction with -source 1.7".
>>> Could you please help? I do not know what this error means.
>>>
>>> thanks in advance,
>>> Richard
>>>
>>>
>>>
>
