[ https://issues.apache.org/jira/browse/SPARK-5389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14517695#comment-14517695 ]

Mark Smiley edited comment on SPARK-5389 at 4/28/15 7:13 PM:
-------------------------------------------------------------

I have the same issue on Spark 1.3.1 using Windows 7 with both Java 8 and Java 7. The proposed workarounds don't work. I ran:

{code}
cd \spark\bin
spark-shell
{code}

which yields the following error:

{code}
find: 'version': No such file or directory
else was unexpected at this time
{code}
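
For reference, here is a hedged illustration of why that first line can appear (my reading of the symptom, not the actual contents of the Spark scripts): the .cmd launchers apparently filter command output through the Windows find "string" utility, and a GNU find earlier on the PATH interprets the quoted string as a directory to search instead:

{code}
rem Windows' built-in filter prints lines containing "version":
java -version 2>&1 | C:\Windows\System32\find.exe "version"

rem GNU find (e.g. from c:\Rtools\bin) given the same arguments treats
rem "version" as a path to search, failing with exactly:
rem   find: 'version': No such file or directory
java -version 2>&1 | find "version"
{code}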

The same error occurs with spark-shell2.cmd.

Both the regular Windows command prompt and PowerShell show the same issue.

The PySpark shell starts, but with errors, and doesn't work properly once started (e.g., it can't find sc). I've attached a screenshot of the errors on startup.

I'm using the Spark 1.3.1 binary for Hadoop 2.6.
Note: Hadoop is not installed on this machine.
Scala works by itself, Python works by itself, and Java works fine (I use it all the time).

Based on another comment, I tried Java 7 (1.7.0_79), but it made no difference (same error).

Here are the relevant environment variables and the start of my PATH:

{code}
JAVA_HOME = C:\jdk1.8.0\bin
PATH = C:\jdk1.8.0\bin\;C:\Program Files (x86)\scala\bin;C:\Python27;c:\Rtools\bin;c:\Rtools\gcc-4.6.3\bin;etc.
{code}
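
A possibly relevant detail (an observation on the values above, not a confirmed diagnosis): c:\Rtools\bin appears on the PATH, and Rtools bundles Unix-style tools, including a GNU find. You can check which find cmd.exe resolves first:

{code}
rem Lists every find.exe on the PATH in resolution order; if one from
rem Rtools or Git precedes C:\Windows\System32\find.exe, batch scripts
rem that pipe through find "version" will fail as shown above.
where find
{code}

Also note that JAVA_HOME conventionally points at the JDK root (C:\jdk1.8.0), not its bin directory, since scripts typically invoke %JAVA_HOME%\bin\java.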





> spark-shell.cmd does not run from DOS Windows 7
> -----------------------------------------------
>
>                 Key: SPARK-5389
>                 URL: https://issues.apache.org/jira/browse/SPARK-5389
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 1.2.0
>         Environment: Windows 7
>            Reporter: Yana Kadiyska
>         Attachments: SparkShell_Win7.JPG, spark_bug.png
>
>
> spark-shell.cmd crashes in the DOS prompt on Windows 7 but works fine under PowerShell.
> spark-shell.cmd works fine for me in v1.1, so this is new in Spark 1.2.
> Marking as trivial since calling spark-shell2.cmd also works fine.
> Attaching a screenshot since the error isn't very useful:
> {code}
> spark-1.2.0-bin-cdh4>bin\spark-shell.cmd
> else was unexpected at this time.
> {code}


