[ https://issues.apache.org/jira/browse/SPARK-13915?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15197166#comment-15197166 ]

Sean Owen commented on SPARK-13915:
-----------------------------------

(It's not a CDH difference; the script is identical: 
https://github.com/cloudera/spark/blob/cdh5-1.5.0_5.5.2/bin/spark-submit )

I think this is due to (a) a custom modification to spark-submit, (b) a custom 
environment change, or (c) a custom build of Spark 1.5.1. Symlinks themselves 
are not an issue. I'd have to close this without clarity on what problem in 
Spark trunk, related to symlinks, is solved by this change.

> Allow bin/spark-submit to be called via symbolic link
> -----------------------------------------------------
>
>                 Key: SPARK-13915
>                 URL: https://issues.apache.org/jira/browse/SPARK-13915
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Submit
>         Environment: CentOS 6.6
> Tarball Spark distribution and CDH-5.x.x Spark version (both)
>            Reporter: Rafael Pecin Ferreira
>            Priority: Minor
>
> We have a CDH-5 cluster that comes with Spark 1.5.0, and we needed to use 
> Spark 1.5.1 for bug fixes.
> When I registered the Spark build (outside the CDH box) with the system 
> alternatives, it created a chain of symbolic links to the target Spark 
> installation.
> When I tried to run spark-submit, bash invoked the target with "$0" set to 
> /usr/bin/spark-submit, but the script uses "$0" to locate its dependencies, 
> so I was seeing these messages:
> [hdfs@server01 ~]$ env spark-submit
> ls: cannot access /usr/assembly/target/scala-2.10: No such file or directory
> Failed to find Spark assembly in /usr/assembly/target/scala-2.10.
> You need to build Spark before running this program.
> I fixed the spark-submit script by adding these lines:
> if [ -h "$0" ] ; then
>     checklink="$0"
>     while [ -h "$checklink" ] ; do
>         checklink="$(readlink "$checklink")"
>     done
>     SPARK_HOME="$(cd "$(dirname "$checklink")/.."; pwd)"
> else
>     SPARK_HOME="$(cd "$(dirname "$0")/.."; pwd)"
> fi
> It would be very nice if this piece of code were put into the spark-submit 
> script, to let us have multiple Spark alternatives on the system.
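The snippet quoted above assumes readlink always returns an absolute path; on many systems a symlink target can be relative to the link's own directory, which would break the loop. A minimal, hedged sketch of the same idea that also handles relative targets (function and variable names here are illustrative, not from spark-submit):

```shell
#!/usr/bin/env bash
# Hypothetical helper: walk a chain of symlinks and print the grandparent
# directory of the final target, the way spark-submit derives SPARK_HOME
# from bin/spark-submit. Not the actual spark-submit code.
resolve_home() {
  script="$1"
  while [ -h "$script" ]; do
    # readlink may return a target relative to the link's own directory,
    # so remember that directory before following the link.
    dir="$(cd "$(dirname "$script")" && pwd)"
    script="$(readlink "$script")"
    case "$script" in
      /*) ;;                       # absolute target: use as-is
      *)  script="$dir/$script" ;; # relative target: anchor at link dir
    esac
  done
  (cd "$(dirname "$script")/.." && pwd)
}
```

In the script itself the call would look like `SPARK_HOME="$(resolve_home "$0")"`, which reduces to the plain `dirname`-based computation when `$0` is not a symlink.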



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
