[ https://issues.apache.org/jira/browse/SPARK-5389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14567468#comment-14567468 ]

Mark Smiley commented on SPARK-5389:
------------------------------------

I have tried several settings for JAVA_HOME (C:\jdk1.8.0\bin, C:\jdk1.8.0\bin\,
C:\jdk1.8.0, C:\jdk1.8.0\, and even C:\jdk1.8.0\jre). None of them fixed the issue.
I use Java a lot, and other applications (e.g., NetBeans) have no problem with the
JAVA_HOME setting. Note that there are no spaces in the JAVA_HOME path. There is a
space in the path to Scala, but that is the default installation path for Scala.
I have also verified the same issue on Windows 8.1.
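
For reference, here is a minimal, purely illustrative .cmd sketch (not taken from
Spark's own scripts; the SCALA_HOME value and paths are hypothetical) of how a space
in a path such as the default Scala installation directory can break an unquoted
if/else comparison in a batch file with a "... was unexpected at this time" parser
error, and of the conventional JAVA_HOME format (the JDK root, no \bin, no trailing
backslash):

{code}
@echo off
rem Illustrative sketch only -- not taken from Spark's actual .cmd scripts.

rem JAVA_HOME conventionally points at the JDK root, not at \bin,
rem and has no trailing backslash:
set JAVA_HOME=C:\jdk1.8.0

rem A hypothetical Scala path containing a space and parentheses,
rem like the default installation directory:
set SCALA_HOME=C:\Program Files (x86)\scala

rem Quoted comparison: survives the space and the parentheses.
if "%SCALA_HOME%" == "C:\scala" (
  echo Scala is in C:\scala
) else (
  echo Scala is at "%SCALA_HOME%"
)

rem Unquoted comparison: after expansion cmd sees
rem   if C:\Program Files (x86)\scala == C:\scala (
rem and aborts with a parser error of the form
rem   "... was unexpected at this time".
if %SCALA_HOME% == C:\scala (
  echo Scala is in C:\scala
) else (
  echo Scala is at %SCALA_HOME%
)
{code}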

> spark-shell.cmd does not run from DOS Windows 7
> -----------------------------------------------
>
>                 Key: SPARK-5389
>                 URL: https://issues.apache.org/jira/browse/SPARK-5389
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, Spark Shell, Windows
>    Affects Versions: 1.2.0
>         Environment: Windows 7
>            Reporter: Yana Kadiyska
>         Attachments: SparkShell_Win7.JPG, spark_bug.png
>
>
> spark-shell.cmd crashes in the DOS prompt on Windows 7 but works fine under PowerShell.
> spark-shell.cmd works fine for me in v1.1, so this is new in Spark 1.2.
> Marking as trivial since calling spark-shell2.cmd directly also works fine.
> Attaching a screenshot since the error isn't very useful:
> {code}
> spark-1.2.0-bin-cdh4>bin\spark-shell.cmd
> else was unexpected at this time.
> {code}


