https://issues.apache.org/jira/browse/SPARK-2356

Take a look through the comments, there are some workarounds listed there.
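
For context, the IOException in the quoted trace comes from Hadoop's Shell class, which builds the winutils path by prepending the HADOOP_HOME environment variable; when that variable is unset, Java's string concatenation turns the null into the literal "null\bin\winutils.exe" seen in the error. A rough Python illustration of that path construction (the winutils_path helper is mine for illustration, not Hadoop's API):

```python
import os

# Sketch (assumption: mirrors the path-joining done around
# Shell.getQualifiedBinPath in Hadoop 2.4). Hadoop appends
# \bin\winutils.exe to HADOOP_HOME; Java renders a null home
# dir as the string "null", which we emulate here.
def winutils_path(hadoop_home):
    home = "null" if hadoop_home is None else hadoop_home
    return home + "\\bin\\winutils.exe"

# With HADOOP_HOME unset, this reproduces the path from the error message.
print(winutils_path(os.environ.get("HADOOP_HOME")))
print(winutils_path("C:\\hadoop"))
```

In practice the workaround commonly cited in the JIRA comments is to obtain a winutils.exe built for your Hadoop version (2.4 here), place it under some directory's bin\ subfolder (e.g. C:\hadoop\bin), and set HADOOP_HOME to that directory before running spark-submit.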

On Wed, Jan 28, 2015 at 1:40 PM, Wang, Ningjun (LNG-NPV)
<ningjun.w...@lexisnexis.com> wrote:
> Has anybody successfully installed and run spark-1.2.0 on Windows 2008 R2 or
> Windows 7? How did you get it to work?
>
> Regards,
>
> Ningjun Wang
> Consulting Software Engineer
> LexisNexis
> 121 Chanlon Road
> New Providence, NJ 07974-1541
>
> From: Wang, Ningjun (LNG-NPV) [mailto:ningjun.w...@lexisnexis.com]
> Sent: Tuesday, January 27, 2015 10:28 PM
> To: user@spark.apache.org
> Subject: Spark on Windows 2008 R2 server does not work
>
>
>
> I downloaded and installed the spark-1.2.0-bin-hadoop2.4.tgz pre-built version
> on a Windows 2008 R2 server. When I submit a job using spark-submit, I get the
> following error:
>
>
>
> WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop
> library for your platform ... using builtin-java classes where applicable
>
> ERROR org.apache.hadoop.util.Shell: Failed to locate the winutils binary in
> the hadoop binary path
> java.io.IOException: Could not locate executable null\bin\winutils.exe in
> the Hadoop binaries.
>         at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:318)
>         at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:333)
>         at org.apache.hadoop.util.Shell.<clinit>(Shell.java:326)
>         at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
>         at org.apache.hadoop.security.Groups.parseStaticMapping(Groups.java:93)
>         at org.apache.hadoop.security.Groups.<init>(Groups.java:77)
>         at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240)
>         at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:255)
>
> Please advise. Thanks.
>
> Ningjun



-- 
Marcelo

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
