Github user patrungel commented on the pull request:

    https://github.com/apache/spark/pull/8669#issuecomment-150209426
  
    The change aims to make the Spark binaries launch properly when symlinked, not just to avoid setting `SPARK_HOME`.
    As of current master, the `SPARK_HOME` value is overwritten in the launchers by
    ```
    export SPARK_HOME="$(cd "`dirname "$0"`"/..; pwd)"
    ```
    which is wrong in general: `dirname "$0"` does not resolve symlinks, so in a symlinked set-up `SPARK_HOME` ends up pointing at the directory containing the symlink rather than the actual installation.
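    For illustration, a symlink-aware variant would first follow the chain of symlinks and only then take the parent directory. The helper name `resolve_spark_home` below is hypothetical; this is a sketch of the idea, not the proposed patch:
    ```shell
    # Hypothetical sketch: resolve the launcher path through any chain of
    # symlinks before deriving SPARK_HOME, so a symlinked launcher still
    # finds the real installation directory.
    resolve_spark_home() {
      target="$1"
      while [ -h "$target" ]; do
        link="$(readlink "$target")"
        case "$link" in
          /*) target="$link" ;;                       # absolute symlink target
          *)  target="$(dirname "$target")/$link" ;;  # relative to the link's dir
        esac
      done
      # the launcher lives in SPARK_HOME/bin, so go one level up
      (cd "$(dirname "$target")/.." && pwd)
    }
    ```
    E.g. with `/usr/bin/spark-submit` symlinked to `/opt/spark/bin/spark-submit`, this yields `/opt/spark`, whereas the current `dirname "$0"` approach yields `/usr`.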
    
    This behaviour prevents proper installation and packaging, i.e. set-ups managed with `alternatives`.
    
    So, if 'resolve this as wont-fix' refers to the issue itself, I beg to differ (and the discussion should then move to the ticket). If it refers only to the suggested change, I don't object.


