Xin Ren created SPARK-15542:
-------------------------------

             Summary: Make error message clear for script './R/install-dev.sh' when R is missing on Mac
                 Key: SPARK-15542
                 URL: https://issues.apache.org/jira/browse/SPARK-15542
             Project: Spark
          Issue Type: Improvement
          Components: SparkR
    Affects Versions: 2.0.0
         Environment: Mac OS X El Capitan
            Reporter: Xin Ren
            Priority: Minor


I followed the instructions at https://github.com/apache/spark/tree/master/R to 
build the SparkR project. When running {code}build/mvn -DskipTests -Psparkr 
package{code} I got the error below:
{code}
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [ 23.589 s]
[INFO] Spark Project Tags ................................. SUCCESS [ 19.389 s]
[INFO] Spark Project Sketch ............................... SUCCESS [  6.386 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 12.296 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  7.817 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [ 10.825 s]
[INFO] Spark Project Launcher ............................. SUCCESS [ 12.262 s]
[INFO] Spark Project Core ................................. FAILURE [01:40 min]
[INFO] Spark Project GraphX ............................... SKIPPED
[INFO] Spark Project Streaming ............................ SKIPPED
[INFO] Spark Project Catalyst ............................. SKIPPED
[INFO] Spark Project SQL .................................. SKIPPED
[INFO] Spark Project ML Local Library ..................... SKIPPED
[INFO] Spark Project ML Library ........................... SKIPPED
[INFO] Spark Project Tools ................................ SKIPPED
[INFO] Spark Project Hive ................................. SKIPPED
[INFO] Spark Project REPL ................................. SKIPPED
[INFO] Spark Project Assembly ............................. SKIPPED
[INFO] Spark Project External Flume Sink .................. SKIPPED
[INFO] Spark Project External Flume ....................... SKIPPED
[INFO] Spark Project External Flume Assembly .............. SKIPPED
[INFO] Spark Integration for Kafka 0.8 .................... SKIPPED
[INFO] Spark Project Examples ............................. SKIPPED
[INFO] Spark Project External Kafka Assembly .............. SKIPPED
[INFO] Spark Project Java 8 Tests ......................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:14 min
[INFO] Finished at: 2016-05-25T21:51:58+00:00
[INFO] Final Memory: 55M/782M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.codehaus.mojo:exec-maven-plugin:1.4.0:exec 
(sparkr-pkg) on project spark-core_2.11: Command execution failed. Process 
exited with an error: 1 (Exit value: 1) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e 
switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please 
read the following articles:
[ERROR] [Help 1] 
http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :spark-core_2.11
{code}

This error turned out to be caused by {code}./R/install-dev.sh{code}.

I then ran the install-dev.sh script directly, and got:
{code}
mbp185-xr:spark quickmobile$ ./R/install-dev.sh
usage: dirname path
{code}

This message was very confusing to me. I then found that R is not properly 
configured on my Mac; the script uses {code}$(which R){code} to locate the R 
home.
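
For context, here is a minimal sketch (an assumption about how install-dev.sh resolves R, not the actual script contents) of how the "usage: dirname path" message can arise when R is missing:
{code}
#!/bin/bash
# Sketch only (assumed, simplified): resolve the R binary via `which R`,
# then derive a directory from it with `dirname`.
R_BIN="$(which R)"               # prints nothing on Mac when R is missing, so R_BIN is empty
R_HOME_DIR="$(dirname $R_BIN)"   # with an empty, unquoted R_BIN, dirname gets no argument
                                 # and prints "usage: dirname path" instead of a useful error
{code}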

I tried a similar situation on CentOS with R missing, and it gives a very 
clear error message, while macOS does not.

On CentOS:
{code}
[root@ip-xxx-31-9-xx spark]# which R
/usr/bin/which: no R in (/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin:/usr/lib/jvm/java-1.7.0-openjdk.x86_64/bin:/root/bin)
{code}

But on Mac, if R is not found, nothing is printed at all, and this is what 
causes the confusing messages for the R build failure and for running 
R/install-dev.sh:
{code}
mbp185-xr:spark xin$ which R
mbp185-xr:spark xin$
{code}

So a clearer error message is needed for this R misconfiguration when running 
R/install-dev.sh.
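
One possible way to handle this (a sketch of an assumed guard, not an actual patch) is to check for R up front in install-dev.sh and fail with an explicit message before dirname is ever reached:
{code}
#!/bin/bash
# Sketch of an assumed guard: detect a missing R binary early and print a
# clear error instead of dirname's "usage: dirname path" message.
if [ -z "$(which R)" ]; then
  echo "Cannot find R. Please install R and make sure it is on your PATH before running R/install-dev.sh." >&2
  exit 1
fi
{code}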



