[ 
https://issues.apache.org/jira/browse/SPARK-5389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14348309#comment-14348309
 ] 

Masayoshi TSUZUKI commented on SPARK-5389:
------------------------------------------

The crashed program "findstr.exe" in the screenshot does not seem to be the one in 
the C:\Windows\System32 directory.
I'm not certain, but I believe "C:\Windows\System32\findstr.exe" on Windows 7 
describes itself as a "(QGREP) utility", not a "(grep) utility".
(Although I don't know its exact English name, since I'm not using an English 
version of Windows.)

[~yanakad], [~s@r@v@n@n], and [SPARK-6084] seem to be reporting similar 
problems.
Their workarounds suggest that the cause might be a polluted %PATH%.
The collision of "find.exe" is a well-known phenomenon on Windows, but as on 
Linux, the order of entries in %PATH% controls which program is called.
If you face a similar problem, you can run {{where find}} to check whether the 
proper "find.exe" is being used.
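The mechanism above (first matching directory in %PATH% wins) can be sketched portably. This is an illustrative Python snippet, not part of Spark: it creates two hypothetical directories each containing an executable named "find" and uses {{shutil.which}} to show that PATH order alone decides which one is resolved, analogous to what {{where find}} reports on Windows.

```python
import os
import shutil
import tempfile

# Two directories, each with an executable named "find", to mimic the
# Windows collision between C:\Windows\System32\find.exe and a
# third-party find.exe that appears earlier on %PATH%.
dir_a = tempfile.mkdtemp()
dir_b = tempfile.mkdtemp()
for d in (dir_a, dir_b):
    exe = os.path.join(d, "find")
    with open(exe, "w") as f:
        f.write("#!/bin/sh\necho stub\n")
    os.chmod(exe, 0o755)  # mark as executable so which() will match it

# Like `where find`, shutil.which resolves using the search-path order:
# whichever directory is listed first provides the program that runs.
first = shutil.which("find", path=os.pathsep.join([dir_a, dir_b]))
second = shutil.which("find", path=os.pathsep.join([dir_b, dir_a]))
print(first)   # the copy in dir_a
print(second)  # the copy in dir_b
```

So a script that expects the System32 "find.exe" can break as soon as another directory containing its own "find.exe" is prepended to %PATH%.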

Would you mind attaching the output of these commands?
{quote}
  where find
  where findstr
  echo %PATH%
{quote}

> spark-shell.cmd does not run from DOS Windows 7
> -----------------------------------------------
>
>                 Key: SPARK-5389
>                 URL: https://issues.apache.org/jira/browse/SPARK-5389
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.2.0
>         Environment: Windows 7
>            Reporter: Yana Kadiyska
>         Attachments: SparkShell_Win7.JPG
>
>
> spark-shell.cmd crashes in DOS prompt Windows 7. Works fine under PowerShell. 
> spark-shell.cmd works fine for me in v.1.1 so this is new in spark1.2
> Marking as trivial since calling spark-shell2.cmd also works fine
> Attaching a screenshot since the error isn't very useful:
> {code}
> spark-1.2.0-bin-cdh4>bin\spark-shell.cmd
> else was unexpected at this time.
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
