[jira] [Commented] (SPARK-5389) spark-shell.cmd does not run from DOS Windows 7

2015-06-01 Thread Mark Smiley (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-5389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14567468#comment-14567468
 ] 

Mark Smiley commented on SPARK-5389:


I have tried several settings for JAVA_HOME (C:\jdk1.8.0\bin, C:\jdk1.8.0\bin\, 
C:\jdk1.8.0, C:\jdk1.8.0\, even C:\jdk1.8.0\jre). None fixed the issue. I use 
Java a lot, and other apps (e.g., NetBeans) seem to have no issue with the 
JAVA_HOME setting. Note there are no spaces in the JAVA_HOME path. There is a 
space in the path to Scala, but that's the default installation path for Scala.
Also verified the same issue on Windows 8.1.
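
The trial-and-error above can be narrowed down: by convention JAVA_HOME points at the JDK root (e.g. C:\jdk1.8.0), not at its bin or jre subdirectory, and values containing spaces or parentheses are known to break unquoted `%JAVA_HOME%` expansions inside cmd scripts (a plausible source of "else was unexpected at this time"). A minimal diagnostic sketch, not part of Spark; the function name and heuristics are hypothetical, and the last candidate path is an invented example with spaces:

```python
def check_java_home(java_home: str) -> list[str]:
    """Return a list of likely problems with a JAVA_HOME value.

    Heuristic sketch: JAVA_HOME conventionally points at the JDK root,
    and values with spaces or parentheses can break unquoted expansions
    in Windows .cmd scripts.
    """
    problems = []
    norm = java_home.rstrip("\\/")
    if norm.lower().endswith("bin"):
        problems.append("points at bin; use the JDK root instead")
    if norm.lower().endswith("jre"):
        problems.append("points at the JRE; scripts may expect the JDK root")
    if " " in java_home:
        problems.append("contains a space; unquoted %JAVA_HOME% will split")
    if "(" in java_home or ")" in java_home:
        problems.append("contains parentheses; can close a cmd if/else block early")
    return problems

# Some of the settings tried above, plus a hypothetical path with spaces:
for candidate in ["C:\\jdk1.8.0\\bin", "C:\\jdk1.8.0", "C:\\jdk1.8.0\\jre",
                  "C:\\Program Files (x86)\\Java\\jdk1.8.0"]:
    print(candidate, "->", check_java_home(candidate) or "looks conventional")
```

On this reading, none of the settings tried would crash the script by themselves (none contain spaces), which is consistent with the space sitting elsewhere, e.g. in the Scala path.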

 spark-shell.cmd does not run from DOS Windows 7
 ---

 Key: SPARK-5389
 URL: https://issues.apache.org/jira/browse/SPARK-5389
 Project: Spark
  Issue Type: Bug
  Components: PySpark, Spark Shell, Windows
Affects Versions: 1.2.0
 Environment: Windows 7
Reporter: Yana Kadiyska
 Attachments: SparkShell_Win7.JPG, spark_bug.png


 spark-shell.cmd crashes in DOS prompt Windows 7. Works fine under PowerShell. 
 spark-shell.cmd works fine for me in v.1.1 so this is new in spark1.2
 Marking as trivial since calling spark-shell2.cmd also works fine
 Attaching a screenshot since the error isn't very useful:
 {code}
 spark-1.2.0-bin-cdh4bin\spark-shell.cmd
 else was unexpected at this time.
 {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Comment Edited] (SPARK-5389) spark-shell.cmd does not run from DOS Windows 7

2015-06-01 Thread Mark Smiley (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-5389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14567468#comment-14567468
 ] 

Mark Smiley edited comment on SPARK-5389 at 6/1/15 3:54 PM:


I have tried several settings for JAVA_HOME (C:\jdk1.8.0\bin, C:\jdk1.8.0\bin\, 
C:\jdk1.8.0, C:\jdk1.8.0\, even C:\jdk1.8.0\jre). None fixed the issue. I use 
Java a lot, and other apps (e.g., NetBeans) seem to have no issue with the 
JAVA_HOME setting. Note there are no spaces in the JAVA_HOME path. There is a 
space in the path to Scala, but that's the default installation path for Scala.
There is no Java 6 on either of these systems.
Also verified the same issue on Windows 8.1.


was (Author: drfractal):
I have tried several settings for JAVA_HOME (C:\jdk1.8.0\bin, C:\jdk1.8.0\bin\, 
C:\jdk1.8.0, C:\jdk1.8.0\, even C:\jdk1.8.0\jre). None fixed the issue. I use 
Java a lot, and other apps (e.g., NetBeans) seem to have no issue with the 
JAVA_HOME setting. Note there are no spaces in the JAVA_HOME path. There is a 
space in the path to Scala, but that's the default installation path for Scala.
Also verified the same issue on Windows 8.1.







[jira] [Commented] (SPARK-5389) spark-shell.cmd does not run from DOS Windows 7

2015-06-01 Thread Mark Smiley (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-5389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14568317#comment-14568317
 ] 

Mark Smiley commented on SPARK-5389:


Sean, can you share your JAVA_HOME and the relevant parts of your path?  I've 
reproduced this error on 2 different machines. I'd like to resolve it, too.
Also, what versions of Java, Scala and Spark are you running?
Thanks







[jira] [Comment Edited] (SPARK-5389) spark-shell.cmd does not run from DOS Windows 7

2015-06-01 Thread Mark Smiley (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-5389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14568317#comment-14568317
 ] 

Mark Smiley edited comment on SPARK-5389 at 6/2/15 12:52 AM:
-

Sean, can you share your JAVA_HOME and the relevant parts of your path 
(including your scala path)?  I've reproduced this error on 2 different 
machines. I'd like to resolve it, too.
Also, what versions of Java, Scala and Spark are you running?
Thanks


was (Author: drfractal):
Sean, can you share your JAVA_HOME and the relevant parts of your path?  I've 
reproduced this error on 2 different machines. I'd like to resolve it, too.
Also, what versions of Java, Scala and Spark are you running?
Thanks







[jira] [Created] (SPARK-7195) Can't start spark shell or pyspark in Windows 7

2015-04-28 Thread Mark Smiley (JIRA)
Mark Smiley created SPARK-7195:
--

 Summary: Can't start spark shell or pyspark in Windows 7
 Key: SPARK-7195
 URL: https://issues.apache.org/jira/browse/SPARK-7195
 Project: Spark
  Issue Type: Bug
  Components: PySpark, Spark Shell
Affects Versions: 1.3.1
 Environment: Windows 7, Java 8 (1.8.0_31) or Java 7 (1.7.0_79), Scala 
2.11.6, Python 2.7
Reporter: Mark Smiley


cd\spark\bin dir
spark-shell
yields following error:
find: 'version': No such file or directory
else was unexpected at this time

Same error with 
spark-shell2.cmd

PyShell starts but with errors and doesn't work properly once started
(e.g., can't find sc). Can send screenshot of errors on request.

Using Spark 1.3.1 for Hadoop 2.6 binary
Note: Hadoop not installed on machine.
Scala works by itself, Python works by itself
Java works fine (I use it all the time)

Based on another comment, tried Java 7 (1.7.0_79), but it made no difference 
(same error).

JAVA_HOME = C:\jdk1.8.0\bin
C:\jdk1.8.0\bin\;C:\Program Files 
(x86)\scala\bin;C:\Python27;c:\Rtools\bin;c:\Rtools\gcc-4.6.3\bin;C:\Oracle\product64\12.1.0\client_1\bin;C:\Oracle\product\12.1.0\client_1\bin;C:\ProgramData\Oracle\Java\javapath;C:\Program
 Files (x86)\Intel\iCLS Client\;C:\Program Files\Intel\iCLS 
Client\;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Program
 Files\Intel\Intel(R) Management Engine Components\DAL;C:\Program 
Files\Intel\Intel(R) Management Engine Components\IPT;C:\Program Files 
(x86)\Intel\Intel(R) Management Engine Components\DAL;C:\Program Files 
(x86)\Intel\Intel(R) Management Engine Components\IPT;C:\Program 
Files\Dell\Dell Data Protection\Access\Advanced\Wave\Gemalto\Access 
Client\v5\;C:\Program Files (x86)\NTRU Cryptosystems\NTRU TCG Software 
Stack\bin\;C:\Program Files\NTRU Cryptosystems\NTRU TCG Software 
Stack\bin\;C:\Program Files (x86)\Intel\OpenCL SDK\2.0\bin\x86;C:\Program Files 
(x86)\Intel\OpenCL SDK\2.0\bin\x64;C:\Program Files\MiKTeX 
2.9\miktex\bin\x64\;C:\Program Files 
(x86)\ActivIdentity\ActivClient\;C:\Program Files\ActivIdentity\ActivClient\
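
One plausible (unconfirmed) cause of `find: 'version': No such file or directory`: the PATH above lists c:\Rtools\bin and c:\Rtools\gcc-4.6.3\bin before C:\Windows\system32, so a Unix-style GNU find would shadow the Windows find.exe that the Spark .cmd scripts invoke, and GNU find then treats "version" as a directory to search. A sketch that scans a PATH string for this shadowing; the function name and the list of toolchain directory markers are assumptions for illustration, not an exhaustive check:

```python
def find_shadowing_dirs(path_value: str, tool: str = "find") -> list[str]:
    """Return PATH entries listed before System32 that commonly ship a
    Unix-style version of `tool` and would shadow the Windows one.

    Heuristic sketch: matches known Unix-toolchain directories by name
    rather than inspecting the file system, since this is illustrative only.
    """
    unix_toolchain_markers = ("rtools", "gnuwin32", "cygwin", "git\\usr\\bin", "msys")
    shadowing = []
    for entry in path_value.split(";"):
        entry = entry.strip()
        if entry.lower().endswith("system32"):
            break  # entries after System32 resolve later than find.exe
        if any(m in entry.lower() for m in unix_toolchain_markers):
            shadowing.append(entry)
    return shadowing

# Abbreviated form of the PATH above:
path = ("C:\\jdk1.8.0\\bin\\;C:\\Program Files (x86)\\scala\\bin;C:\\Python27;"
        "c:\\Rtools\\bin;c:\\Rtools\\gcc-4.6.3\\bin;C:\\Windows\\system32;C:\\Windows")
print(find_shadowing_dirs(path))
```

If this diagnosis holds, moving C:\Windows\system32 ahead of the Rtools directories (or removing them) would be worth trying.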







[jira] [Updated] (SPARK-7195) Can't start spark shell or pyspark in Windows 7

2015-04-28 Thread Mark Smiley (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-7195?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Smiley updated SPARK-7195:
---
Attachment: spark_bug.png

Sean,

I did look around, and had found the old bug and its duplicate, but the old bug 
was marked as closed and was for Spark 1.2.1.
I didn't see a way to reopen it (but I am new to Jira).

This bug is the latest release.
I also didn't see a way to add Spark 1.3.1 to the versions affected.

I also saw a developer saying he couldn't reproduce it, so this report shows that it 
is reproducible.

Also, I had seen that the old bug had an attached file.
So I had hoped to attach a file showing the Python shell bugs, which may be 
related, and which weren't mentioned in the old bug.
But I couldn't find a way to attach a file. 

I'm attaching it here, so you can see the Python shell issues on startup.
The Python shell starts but it's not possible to do anything with Spark once it 
starts (e.g., there is no sc (Spark Context)).

Mark









[jira] [Updated] (SPARK-5389) spark-shell.cmd does not run from DOS Windows 7

2015-04-28 Thread Mark Smiley (JIRA)

 [ 
https://issues.apache.org/jira/browse/SPARK-5389?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Mark Smiley updated SPARK-5389:
---
Attachment: spark_bug.png

Related python shell error messages on startup.

 spark-shell.cmd does not run from DOS Windows 7
 ---

 Key: SPARK-5389
 URL: https://issues.apache.org/jira/browse/SPARK-5389
 Project: Spark
  Issue Type: Bug
  Components: Spark Shell
Affects Versions: 1.2.0
 Environment: Windows 7
Reporter: Yana Kadiyska
 Attachments: SparkShell_Win7.JPG, spark_bug.png


 spark-shell.cmd crashes in DOS prompt Windows 7. Works fine under PowerShell. 
 spark-shell.cmd works fine for me in v.1.1 so this is new in spark1.2
 Marking as trivial since calling spark-shell2.cmd also works fine
 Attaching a screenshot since the error isn't very useful:
 {code}
 spark-1.2.0-bin-cdh4bin\spark-shell.cmd
 else was unexpected at this time.
 {code}






[jira] [Commented] (SPARK-5389) spark-shell.cmd does not run from DOS Windows 7

2015-04-28 Thread Mark Smiley (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-5389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14517695#comment-14517695
 ] 

Mark Smiley commented on SPARK-5389:


I have the same issue on Spark 1.3.1 using Windows 7 with both Java 8 and Java 
7.
The proposed workarounds don't work.
I did:
cd\spark\bin
 spark-shell

 Yields following error:
 find: 'version': No such file or directory
 else was unexpected at this time

Same error with 
 spark-shell2.cmd

The regular Windows command line and PowerShell both have the same issue.

PyShell starts but with errors and doesn't work properly once started
 (e.g., can't find sc). Can send screenshot of errors on request.

Using Spark 1.3.1 for Hadoop 2.6 binary
 Note: Hadoop not installed on machine.
 Scala works by itself, Python works by itself
 Java works fine (I use it all the time)

Based on another comment, tried Java 7 (1.7.0_79), but it made no difference 
(same error).

Here are some relevant environment variables and the start of my path.

JAVA_HOME = C:\jdk1.8.0\bin
 C:\jdk1.8.0\bin\;C:\Program Files 
(x86)\scala\bin;C:\Python27;c:\Rtools\bin;c:\Rtools\gcc-4.6.3\bin;
etc.




 spark-shell.cmd does not run from DOS Windows 7
 ---

 Key: SPARK-5389
 URL: https://issues.apache.org/jira/browse/SPARK-5389
 Project: Spark
  Issue Type: Bug
  Components: Spark Shell
Affects Versions: 1.2.0
 Environment: Windows 7
Reporter: Yana Kadiyska
 Attachments: SparkShell_Win7.JPG


 spark-shell.cmd crashes in DOS prompt Windows 7. Works fine under PowerShell. 
 spark-shell.cmd works fine for me in v.1.1 so this is new in spark1.2
 Marking as trivial since calling spark-shell2.cmd also works fine
 Attaching a screenshot since the error isn't very useful:
 {code}
 spark-1.2.0-bin-cdh4bin\spark-shell.cmd
 else was unexpected at this time.
 {code}






[jira] [Comment Edited] (SPARK-5389) spark-shell.cmd does not run from DOS Windows 7

2015-04-28 Thread Mark Smiley (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-5389?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14517695#comment-14517695
 ] 

Mark Smiley edited comment on SPARK-5389 at 4/28/15 7:13 PM:
-

I have the same issue on Spark 1.3.1 using Windows 7 with both Java 8 and Java 
7.
The proposed workarounds don't work.
I did:
cd\spark\bin
 spark-shell

 Yields following error:
 find: 'version': No such file or directory
 else was unexpected at this time

Same error with 
 spark-shell2.cmd

The regular Windows command line and PowerShell both have the same issue.

PyShell starts but with errors and doesn't work properly once started
 (e.g., can't find sc). Attached a screenshot of errors on startup.

Using Spark 1.3.1 for Hadoop 2.6 binary
 Note: Hadoop not installed on machine.
 Scala works by itself, Python works by itself
 Java works fine (I use it all the time)

Based on another comment, tried Java 7 (1.7.0_79), but it made no difference 
(same error).

Here are some relevant environment variables and the start of my path.

JAVA_HOME = C:\jdk1.8.0\bin
 C:\jdk1.8.0\bin\;C:\Program Files 
(x86)\scala\bin;C:\Python27;c:\Rtools\bin;c:\Rtools\gcc-4.6.3\bin;
etc.





was (Author: drfractal):
I have the same issue on Spark 1.3.1 using Windows 7 with both Java 8 and Java 
7.
The proposed workarounds don't work.
I did:
cd\spark\bin
 spark-shell

 Yields following error:
 find: 'version': No such file or directory
 else was unexpected at this time

Same error with 
 spark-shell2.cmd

Using regular windows command line and PowerShell have the same issue.

PyShell starts but with errors and doesn't work properly once started
 (e.g., can't find sc). Can send screenshot of errors on request.

Using Spark 1.3.1 for Hadoop 2.6 binary
 Note: Hadoop not installed on machine.
 Scala works by itself, Python works by itself
 Java works fine (I use it all the time)

Based on another comment, tried Java 7 (1.7.0_79), but it made no difference 
(same error).

Here are some relevant environment variables and the start of my path.

JAVA_HOME = C:\jdk1.8.0\bin
 C:\jdk1.8.0\bin\;C:\Program Files 
(x86)\scala\bin;C:\Python27;c:\Rtools\bin;c:\Rtools\gcc-4.6.3\bin;
etc.










[jira] [Commented] (SPARK-7195) Can't start spark shell or pyspark in Windows 7

2015-04-28 Thread Mark Smiley (JIRA)

[ 
https://issues.apache.org/jira/browse/SPARK-7195?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14517826#comment-14517826
 ] 

Mark Smiley commented on SPARK-7195:


Sean,

I added my bug as a comment to the old bug and attached my file there.

Can you add PySpark as one of the components involved?  I don't have permission 
to do that.

Thanks,
Mark





