[ https://issues.apache.org/jira/browse/SPARK-18136?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16175276#comment-16175276 ]
Jakub Nowacki commented on SPARK-18136:
---------------------------------------

[PR|https://github.com/apache/spark/pull/19310] fixes how {{spark-class2.cmd}} looks for the jars directory on Windows. The script currently fails to find the jars and start the JVM because the condition that sets the env variable {{SPARK_JARS_DIR}} checks for {{%SPARK_HOME%\RELEASE}}, a file that is not included in the {{pip}}/{{conda}} build. It should instead check for {{%SPARK_HOME%\jars}}, the directory it later refers to anyway (a sketch of the change follows below the issue details).

The above fixes the errors when importing {{pyspark}} into Python and creating a SparkSession, but there is still an issue with calling {{pyspark.cmd}}. A plain invocation on the command line, without a path, fails with {{The system cannot find the path specified.}} This is likely because the command resolves to the wrapper script in the Anaconda Scripts folder, e.g. {{C:\Tools\Anaconda3\Scripts\pyspark.cmd}}. If the script is run via the full path to the PySpark package, e.g. {{\Tools\Anaconda3\Lib\site-packages\pyspark\bin\pyspark.cmd}}, it works fine. The likely cause is that {{SPARK_HOME}} is resolved with {{set SPARK_HOME=%~dp0..}}, which for the wrapper call (likely) resolves to {{\Tools\Anaconda3\}} when it should resolve to {{\Tools\Anaconda3\Lib\site-packages\pyspark\}} (also illustrated below). Since I don't know CMD scripting that well, I haven't found a solution to this yet, apart from the workaround of calling the script via its full (direct) path.

> Make PySpark pip install works on windows
> -----------------------------------------
>
>                 Key: SPARK-18136
>                 URL: https://issues.apache.org/jira/browse/SPARK-18136
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>            Reporter: holdenk
>
> Make sure that pip installer for PySpark works on windows
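For concreteness, a minimal sketch of the condition change in {{spark-class2.cmd}}. This is my reading of the fix, not the exact diff from the PR; in particular the {{else}} branch shown here follows the standard script layout and may differ slightly from the actual file:

{code}
rem Before: the check assumes a RELEASE file, which the pip/conda build does
rem not ship, so the else branch is taken and the JVM fails to start.
if exist "%SPARK_HOME%\RELEASE" (
  set SPARK_JARS_DIR="%SPARK_HOME%\jars"
) else (
  set SPARK_JARS_DIR="%SPARK_HOME%\assembly\target\scala-%SPARK_SCALA_VERSION%\jars"
)

rem After: check directly for the jars directory the script refers to anyway.
if exist "%SPARK_HOME%\jars" (
  set SPARK_JARS_DIR="%SPARK_HOME%\jars"
) else (
  set SPARK_JARS_DIR="%SPARK_HOME%\assembly\target\scala-%SPARK_SCALA_VERSION%\jars"
)
{code}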
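And a small illustration of the {{%~dp0}} behaviour behind the {{pyspark.cmd}} problem. The paths are the examples from the comment above, and the {{echo}} is a hypothetical diagnostic, not part of the real script:

{code}
rem %~dp0 expands to the drive letter and path of the script actually being
rem executed. When the Anaconda wrapper C:\Tools\Anaconda3\Scripts\pyspark.cmd
rem runs, %~dp0 is C:\Tools\Anaconda3\Scripts\, so SPARK_HOME becomes
rem C:\Tools\Anaconda3\Scripts\.., i.e. C:\Tools\Anaconda3\, rather than
rem C:\Tools\Anaconda3\Lib\site-packages\pyspark\.
set SPARK_HOME=%~dp0..

rem Hypothetical diagnostic to confirm what was resolved:
echo SPARK_HOME resolved to %SPARK_HOME%

rem Workaround: invoke the packaged script directly so %~dp0 points inside
rem the pyspark package:
rem   C:\Tools\Anaconda3\Lib\site-packages\pyspark\bin\pyspark.cmd
{code}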