[ https://issues.apache.org/jira/browse/SPARK-2960?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14743197#comment-14743197 ]

Sean Owen commented on SPARK-2960:
----------------------------------

As an addendum, I think the title is misleading. It's really more like: "if 
Spark scripts are symlinked, they work, but the logic that discovers the Spark 
installation this way doesn't". Should this not be configured by {{SPARK_HOME}} 
and/or {{SPARK_CONF_DIR}} if they are set? Right now some of these scripts 
either don't use them or override them unilaterally.
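
As a minimal sketch of the kind of logic being suggested here (not the actual Spark patch): resolve symlinks on {{$0}} step by step, but only when {{SPARK_HOME}} isn't already set, and likewise leave {{SPARK_CONF_DIR}} alone if the user exported it:

{code}
# Sketch only: honor SPARK_HOME if set; otherwise derive it from the
# script's real location, following symlinks portably (no GNU readlink -f).
if [ -z "$SPARK_HOME" ]; then
  SOURCE="$0"
  while [ -h "$SOURCE" ]; do
    DIR="$(cd "$(dirname "$SOURCE")" && pwd)"
    SOURCE="$(readlink "$SOURCE")"
    # A relative symlink target is resolved against the link's directory.
    case "$SOURCE" in
      /*) ;;
      *) SOURCE="$DIR/$SOURCE" ;;
    esac
  done
  # Scripts live in $SPARK_HOME/bin, so the install root is one level up.
  SPARK_HOME="$(cd "$(dirname "$SOURCE")/.." && pwd)"
  export SPARK_HOME
fi
# Respect a user-provided conf dir; fall back to the default layout.
SPARK_CONF_DIR="${SPARK_CONF_DIR:-$SPARK_HOME/conf}"
export SPARK_CONF_DIR
{code}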

> Spark executables fail to start via symlinks
> --------------------------------------------
>
>                 Key: SPARK-2960
>                 URL: https://issues.apache.org/jira/browse/SPARK-2960
>             Project: Spark
>          Issue Type: Bug
>          Components: Deploy
>            Reporter: Shay Rojansky
>            Priority: Minor
>
> The current scripts (e.g. pyspark) fail to run when they are executed via 
> symlinks. A common Linux scenario would be to have Spark installed somewhere 
> (e.g. /opt) and have a symlink to it in /usr/bin.
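
For illustration, a layout like the one described reproduces the failure (the paths and version are hypothetical, not taken from the report):

{code}
# Spark unpacked under /opt, with a convenience symlink on the PATH.
ln -s /opt/spark-1.0.2/bin/pyspark /usr/bin/pyspark

# Invoking through the symlink leaves $0 as /usr/bin/pyspark, so
# dirname-based discovery looks for the Spark install under /usr
# rather than /opt/spark-1.0.2, and startup fails.
pyspark
{code}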


