You may want to replace the 2.4 with a later release.
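
For example, something like this (just a sketch; I'm assuming the hadoop-2.7 profile is available in your checkout and that Hadoop 2.7.3 is the version you want to target, so adjust as needed):

./dev/make-distribution.sh --name custom-spark --tgz -e -Psparkr -Phadoop-2.7 \
  -Dhadoop.version=2.7.3 -Phive -Phive-thriftserver -Pyarn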

On Sep 29, 2016, at 3:08 AM, AssafMendelson <assaf.mendel...@rsa.com> wrote:

Hi,
I am trying to compile the latest branch of Spark in order to try out some code 
I wanted to contribute.

I was looking at the instructions to build from 
http://spark.apache.org/docs/latest/building-spark.html
So at first I did:
./build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
This compiled without a problem.

I then did:
./dev/make-distribution.sh --name custom-spark --tgz -e -Psparkr -Phadoop-2.4 
-Phive -Phive-thriftserver -Pyarn
This failed.
(I added the -e because the first run, without it, suggested adding it to get 
more information.)
If I look at the compilation output itself, it shows no messages for Spark 
Project Core:

[INFO] Building Spark Project Core 2.1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project YARN Shuffle Service 2.1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------

However, when I reach the summary I find that Core has failed to compile.
Below are the messages from the end of the compilation, but I can't find any 
direct error.
I tried to Google this but found no solution. Could anyone point me to how to 
fix this?


[INFO] --- maven-compiler-plugin:3.5.1:compile (default-compile) @ 
spark-core_2.11 ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 74 source files to 
/home/mendea3/git/spark/core/target/scala-2.11/classes
[INFO]
[INFO] --- exec-maven-plugin:1.4.0:exec (sparkr-pkg) @ spark-core_2.11 ---
Cannot find 'R_HOME'. Please specify 'R_HOME' or make sure R is properly 
installed.
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [  4.165 s]
[INFO] Spark Project Tags ................................. SUCCESS [  5.163 s]
[INFO] Spark Project Sketch ............................... SUCCESS [  7.393 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 18.929 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 10.528 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [ 14.453 s]
[INFO] Spark Project Launcher ............................. SUCCESS [ 15.198 s]
[INFO] Spark Project Core ................................. FAILURE [ 57.641 s]
[INFO] Spark Project ML Local Library ..................... SUCCESS [ 10.561 s]
[INFO] Spark Project GraphX ............................... SKIPPED
[INFO] Spark Project Streaming ............................ SKIPPED
[INFO] Spark Project Catalyst ............................. SKIPPED
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SUCCESS [  4.188 s]
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project YARN Shuffle Service ................. SUCCESS [ 16.128 s]
[INFO] Spark Project YARN ................................. SKIPPED
[INFO] Spark Project Hive Thrift Server ................... SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Flume Sink .................. SUCCESS [  9.855 s]
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External Flume Assembly .............. SKIPPED
[INFO] Spark Integration for Kafka 0.8 .................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO] Spark Integration for Kafka 0.10 ................... SKIPPED
[INFO] Spark Integration for Kafka 0.10 Assembly .......... SKIPPED
[INFO] Spark Project Java 8 Tests ......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:52 min (Wall Clock)
[INFO] Finished at: 2016-09-29T10:48:57+03:00
[INFO] Final Memory: 49M/771M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.4.0:exec 
(sparkr-pkg) on project spark-core_2.11: Command execution failed. Process 
exited with an error: 1 (Exit value: 1) -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal 
org.codehaus.mojo:exec-maven-plugin:1.4.0:exec (sparkr-pkg) on project 
spark-core_2.11: Command execution failed.
        at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:212)
        at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
        at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
        at 
org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:116)
        at 
org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call(MultiThreadedBuilder.java:185)
        at 
org.apache.maven.lifecycle.internal.builder.multithreaded.MultiThreadedBuilder$1.call(MultiThreadedBuilder.java:181)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.maven.plugin.MojoExecutionException: Command execution 
failed.
        at org.codehaus.mojo.exec.ExecMojo.execute(ExecMojo.java:276)
        at 
org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:134)
        at 
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:207)
        ... 11 more
Caused by: org.apache.commons.exec.ExecuteException: Process exited with an 
error: 1 (Exit value: 1)
        at 
org.apache.commons.exec.DefaultExecutor.executeInternal(DefaultExecutor.java:404)
        at 
org.apache.commons.exec.DefaultExecutor.execute(DefaultExecutor.java:166)
        at org.codehaus.mojo.exec.ExecMojo.executeCommandLine(ExecMojo.java:660)
        at org.codehaus.mojo.exec.ExecMojo.execute(ExecMojo.java:265)
        ... 13 more
[ERROR]
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :spark-core_2.11
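
For what it's worth, the only message above that looks like an actual problem 
(other than the final exec failure itself) is the "Cannot find 'R_HOME'" line 
from the sparkr-pkg step, which only runs because of -Psparkr. If that is the 
real cause, would something like the following be the right fix (just a guess 
on my part; /usr/lib/R is only an example path for wherever R is installed), 
or should I simply drop -Psparkr or install R properly?

export R_HOME=/usr/lib/R    # example only: the directory containing bin/R
./dev/make-distribution.sh --name custom-spark --tgz -e -Psparkr -Phadoop-2.4 \
  -Phive -Phive-thriftserver -Pyarn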

