[ https://issues.apache.org/jira/browse/SPARK-25651?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nick Sutcliffe updated SPARK-25651:
-----------------------------------
    Description: 
I have multiple versions of Spark on my computer, and in particular SPARK_HOME 
points to a Spark 2.0.2 installation.

If I browse to the bin directory of my Spark 2.3.2 installation and run 
spark-shell, it incorrectly uses my Spark 2.0.2 installation as SPARK_HOME. 
Previously, spark-shell2.cmd derived SPARK_HOME from its own location as 
follows (verified in Spark 2.0.2 and Spark 2.2.0):

`set SPARK_HOME=%~dp0..`

However, this line is not present in Spark 2.3.2; spark-shell2.cmd instead 
calls find-spark-home.cmd, which appears to incorrectly prefer a SPARK_HOME 
already set in the environment over the location of the installation actually 
being launched.
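For illustration, here is a minimal sketch of the kind of check that would 
produce this behavior; this is my assumption about the shape of the logic, not 
code copied from find-spark-home.cmd:

```
@echo off
rem Hypothetical sketch of the suspect logic in find-spark-home.cmd.
rem If SPARK_HOME is already set in the environment, keep it as-is,
rem even when this script lives in a different installation's bin directory.
if not "x%SPARK_HOME%"=="x" goto :eof

rem Otherwise fall back to this script's own location. %~dp0 expands to the
rem drive and path of the running script, so %~dp0.. is the installation root.
set SPARK_HOME=%~dp0..
```

Until this is fixed, clearing the variable in the current cmd session before 
launching (`set SPARK_HOME=`) should make the 2.3.2 scripts fall back to their 
own location.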

  was:
I have multiple versions of Spark on my computer, and in particular SPARK_HOME 
points to a Spark 2.0.2 installation.

If I browse to the bin directory of my Spark 2.3.2 installation and run 
spark-shell, it incorrectly uses my Spark 2.0.2 installation as SPARK_HOME. 
Previously, spark-shell2.cmd derived SPARK_HOME from its own location as 
follows (verified in Spark 2.0.2 and Spark 2.2.0):

`set SPARK_HOME=%~dp0..`

However, this line is not present in Spark 2.3.2, so it falls back to the 
environment's SPARK_HOME, and the wrong version of Spark starts.


> spark-shell gets wrong version of Spark on Windows
> --------------------------------------------------
>
>                 Key: SPARK-25651
>                 URL: https://issues.apache.org/jira/browse/SPARK-25651
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 2.3.2
>         Environment: Windows 10, running Spark 2.3.2
>            Reporter: Nick Sutcliffe
>            Priority: Major


