RE: building runnable distribution from source
Thanks, that solved it. If there is a developer here: it would be useful if this error were marked as ERROR instead of INFO (especially since it causes core to fail, rather than just an R package).

Thanks, Assaf.

-----Original Message-----
From: Ding Fei [mailto:ding...@stars.org.cn]
Sent: Thursday, September 29, 2016 1:20 PM
To: Mendelson, Assaf
Cc: user@spark.apache.org
Subject: Re: building runnable distribution from source

Check that your R is properly installed:

> Cannot find 'R_HOME'. Please specify 'R_HOME' or make sure R is properly installed.

On Thu, 2016-09-29 at 01:08 -0700, AssafMendelson wrote:
> Hi,
>
> I am trying to compile the latest branch of Spark in order to try out
> some code I wanted to contribute.
>
> I was looking at the build instructions at
> http://spark.apache.org/docs/latest/building-spark.html
>
> So at first I did:
>
>   ./build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
>
> This worked without a problem and compiled.
>
> I then did:
>
>   ./dev/make-distribution.sh --name custom-spark --tgz -e -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn
>
> which failed. (I added the -e because the first run, without it,
> suggested adding it to get more information.)
>
> Looking at the compilation output itself, there are no messages for
> Spark Project Core:
>
>   [INFO] Building Spark Project Core 2.1.0-SNAPSHOT
>   [INFO] ------------------------------------------------------------------------
>   [INFO]
>   [INFO] ------------------------------------------------------------------------
>   [INFO] Building Spark Project YARN Shuffle Service 2.1.0-SNAPSHOT
>   [INFO] ------------------------------------------------------------------------
>
> However, when I reach the summary I find that core has failed to
> compile. Below are the messages from the end of the compilation, but
> I can't find any direct error. I tried to google this but found no
> solution. Could anyone point me to how to fix this?
>
>   [INFO] --- maven-compiler-plugin:3.5.1:compile (default-compile) @ spark-core_2.11 ---
>   [INFO] Changes detected - recompiling the module!
>   [INFO] Compiling 74 source files to /home/mendea3/git/spark/core/target/scala-2.11/classes
>   [INFO]
>   [INFO] --- exec-maven-plugin:1.4.0:exec (sparkr-pkg) @ spark-core_2.11 ---
>   Cannot find 'R_HOME'. Please specify 'R_HOME' or make sure R is properly installed.
>   [INFO] ------------------------------------------------------------------------
>   [INFO] Reactor Summary:
>   [INFO]
>   [INFO] Spark Project Parent POM ........................... SUCCESS [  4.165 s]
>   [INFO] Spark Project Tags ................................. SUCCESS [  5.163 s]
>   [INFO] Spark Project Sketch ............................... SUCCESS [  7.393 s]
>   [INFO] Spark Project Networking ........................... SUCCESS [ 18.929 s]
>   [INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [ 10.528 s]
>   [INFO] Spark Project Unsafe ............................... SUCCESS [ 14.453 s]
>   [INFO] Spark Project Launcher ............................. SUCCESS [ 15.198 s]
>   [INFO] Spark Project Core ................................. FAILURE [ 57.641 s]
>   [INFO] Spark Project ML Local Library ..................... SUCCESS [ 10.561 s]
>   [INFO] Spark Project GraphX ............................... SKIPPED
>   [INFO] Spark Project Streaming ............................ SKIPPED
>   [INFO] Spark Project Catalyst ............................. SKIPPED
>   [INFO] Spark Project SQL .................................. SKIPPED
>   [INFO] Spark Project ML Library ........................... SKIPPED
>   [INFO] Spark Project Tools ................................ SUCCESS [  4.188 s]
>   [INFO] Spark Project Hive ................................. SKIPPED
>   [INFO] Spark Project REPL ................................. SKIPPED
>   [INFO] Spark Project YARN Shuffle Service ................. SUCCESS [ 16.128 s]
>   [INFO] Spark Project YARN ................................. SKIPPED
>   [INFO] Spark Project Hive Thrift Server ................... SKIPPED
>   [INFO] Spark Project Assembly ............................. SKIPPED
>   [INFO] Spark Project External Flume Sink .................. SUCCESS [  9.855 s]
>   [INFO] Spark Project Extern
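A side note for anyone else staring at a reactor summary with no obvious error: the root-cause line above is printed by exec-maven-plugin with no log level at all, so it hides between hundreds of `[INFO]` lines. A minimal sketch for filtering a captured build log (the function name is made up for illustration; capture the log yourself with `... 2>&1 | tee build.log`):

```shell
# Surface the real failure in a Maven reactor log: keep [ERROR] lines
# plus any line with no log-level prefix at all (plugin output such as
# "Cannot find 'R_HOME'..." is echoed bare, not tagged [ERROR]).
find_build_errors() {
    grep -vE '^\[(INFO|WARNING)\]' "$1" | grep -v '^$'
}
```

Usage: `find_build_errors build.log` after the failed run; the untagged R_HOME line and the final `[ERROR] Failed to execute goal ...` both survive the filter.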
Re: building runnable distribution from source
You may want to replace the 2.4 with a later release.

On Sep 29, 2016, at 3:08 AM, AssafMendelson wrote:

> Hi,
>
> I am trying to compile the latest branch of Spark in order to try out
> some code I wanted to contribute.
>
> I was looking at the build instructions at
> http://spark.apache.org/docs/latest/building-spark.html
>
> So at first I did:
>
>   ./build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
>
> This worked without a problem and compiled.
>
> I then did:
>
>   ./dev/make-distribution.sh --name custom-spark --tgz -e -Psparkr -Phadoop-2.4 -Phive -Phive-thriftserver -Pyarn
>
> which failed. (I added the -e because the first run, without it,
> suggested adding it to get more information.)
>
> [...]
>
>   [INFO] --- exec-maven-plugin:1.4.0:exec (sparkr-pkg) @ spark-core_2.11 ---
>   Cannot find 'R_HOME'. Please specify 'R_HOME' or make sure R is properly installed.
>
> [...]
>
>   [INFO] Spark Project Core ................................. FAILURE [ 57.641 s]
>
> [...]
>
>   [INFO] BUILD FAILURE
>   [INFO] ------------------------------------------------------------------------
>   [INFO] Total time: 01:52 min (Wall Clock)
>   [INFO] Finished at: 2016-09-29T10:48:57+03:00
>   [INFO] Final Memory: 49M/771M
>   [INFO] ------------------------------------------------------------------------
>   [ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.4.0:exec (sparkr-pkg) on project spark-core_2.11: Command execution failed. Process exited with an error: 1 (Exit value: 1) -> [Help 1]
>   org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.4.0:exec (sparkr-pkg) on project spark-core_2.11: Command execution failed.
>   at
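Following up on the suggestion to move off Hadoop 2.4, a sketch of the same invocation against a newer profile. The `-Phadoop-2.7` profile name and the `2.7.3` version are assumptions for this branch; check building-spark.html for the profiles your checkout actually defines.

```shell
# Hypothetical distribution build against Hadoop 2.7 instead of 2.4,
# wrapped in a function so the flags read as one unit. Run from the
# root of a Spark checkout.
build_distribution() {
    ./dev/make-distribution.sh --name custom-spark --tgz -e \
        -Psparkr -Phadoop-2.7 -Dhadoop.version=2.7.3 \
        -Phive -Phive-thriftserver -Pyarn
}
```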
Re: building runnable distribution from source
Check that your R is properly installed:

> Cannot find 'R_HOME'. Please specify 'R_HOME' or make sure R is properly installed.

On Thu, 2016-09-29 at 01:08 -0700, AssafMendelson wrote:
> Hi,
>
> I am trying to compile the latest branch of Spark in order to try out
> some code I wanted to contribute.
>
> [...]
>
>   [INFO] --- exec-maven-plugin:1.4.0:exec (sparkr-pkg) @ spark-core_2.11 ---
>   Cannot find 'R_HOME'. Please specify 'R_HOME' or make sure R is properly installed.
>
> [...]
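The advice above boils down to making sure the build can find an R installation before running make-distribution.sh with -Psparkr. A minimal pre-flight check (the function name is made up for illustration; `R RHOME` is R's standard way of printing its own home directory):

```shell
# Fail fast if R is missing, otherwise export R_HOME for the build:
# the sparkr-pkg exec goal aborts with "Cannot find 'R_HOME'" without it.
check_r_home() {
    if command -v R >/dev/null 2>&1; then
        R_HOME="$(R RHOME)"   # ask the R install for its own home dir
        export R_HOME
        echo "R_HOME=$R_HOME"
    else
        echo "R not found on PATH; install r-base (or your platform's equivalent) first" >&2
        return 1
    fi
}
```

Run `check_r_home` in the same shell before kicking off the build so the exported variable is inherited by Maven.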