Failed to locate the winutils binary in the hadoop binary path

2015-01-29 Thread Naveen Kumar Pokala
Hi, I am facing the following issue when I am connecting from spark-shell. Please tell me how to avoid it. 15/01/29 17:21:27 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop

Re: Failed to locate the winutils binary in the hadoop binary path

2015-01-29 Thread Akhil Das
am connecting from spark-shell. Please tell me how to avoid it. 15/01/29 17:21:27 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries
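The "null" in `null\bin\winutils.exe` comes from an unset `hadoop.home.dir` / `HADOOP_HOME`: Hadoop's shell utilities build the winutils path from that value. A minimal sketch of the usual workaround, assuming winutils.exe has been placed under a local directory (the path `C:\hadoop` here is a hypothetical example, not from the thread):

```scala
// Sketch: set hadoop.home.dir before creating the SparkContext so Hadoop's
// Shell class can find bin\winutils.exe. The directory is an assumption;
// it must actually contain a bin\winutils.exe on a real Windows machine.
object WinutilsFix {
  def main(args: Array[String]): Unit = {
    // Equivalent to setting the HADOOP_HOME environment variable.
    System.setProperty("hadoop.home.dir", "C:\\hadoop")
    println(System.getProperty("hadoop.home.dir"))
  }
}
```

In spark-shell the same line can be run before any Spark action; setting the `HADOOP_HOME` environment variable before launching the shell has the same effect.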

winutils

2014-10-29 Thread Ron Ayoub
Apparently Spark does require Hadoop even if you do not intend to use Hadoop. Is there a workaround for the below error I get when creating the SparkContext in Scala? I will note that I didn't have this problem yesterday when creating the Spark context in Java as part of the getting started

Re: winutils

2014-10-29 Thread Denny Lee
QQ - did you download the Spark 1.1 binaries that included the Hadoop one? Does this happen if you're using the Spark 1.1 binaries that do not include the Hadoop jars? On Wed, Oct 29, 2014 at 11:31 AM, Ron Ayoub ronalday...@live.com wrote: Apparently Spark does require Hadoop even if you do not

RE: winutils

2014-10-29 Thread Ron Ayoub
, 29 Oct 2014 11:38:23 -0700 Subject: Re: winutils From: denny.g@gmail.com To: ronalday...@live.com CC: user@spark.apache.org QQ - did you download the Spark 1.1 binaries that included the Hadoop one? Does this happen if you're using the Spark 1.1 binaries that do not include the Hadoop jars

Re: winutils

2014-10-29 Thread Sean Owen
cf. https://issues.apache.org/jira/browse/SPARK-2356 On Wed, Oct 29, 2014 at 7:31 PM, Ron Ayoub ronalday...@live.com wrote: Apparently Spark does require Hadoop even if you do not intend to use Hadoop. Is there a workaround for the below error I get when creating the SparkContext in Scala? I
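As SPARK-2356 discusses, Spark pulls in Hadoop client libraries even when no Hadoop cluster is used, which is why the lookup runs at all. A sketch that roughly mirrors the lookup order (property first, then environment variable) and shows why an unset value prints as "null"; this is an illustration, not Hadoop's actual `Shell` source:

```scala
// Sketch of the winutils path resolution: hadoop.home.dir wins over
// HADOOP_HOME, and when neither is set the null home is interpolated
// into the path, producing the "null\bin\winutils.exe" error text.
object WinutilsLookup {
  def resolveWinutils(): String = {
    val home = Option(System.getProperty("hadoop.home.dir"))
      .orElse(Option(System.getenv("HADOOP_HOME")))
      .orNull
    s"$home\\bin\\winutils.exe" // "null\bin\winutils.exe" when unset
  }

  def main(args: Array[String]): Unit =
    println(resolveWinutils())
}
```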