Hi,
I am facing the following issue when connecting from spark-shell. Please
tell me how to avoid it.
15/01/29 17:21:27 ERROR Shell: Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries
Apparently Spark does require Hadoop even if you do not intend to use Hadoop.
Is there a workaround for the below error I get when creating the SparkContext
in Scala?
I will note that I didn't have this problem yesterday when creating the Spark
context in Java as part of the getting-started guide.
QQ - did you download the Spark 1.1 binaries that included the Hadoop one?
Does this happen if you're using the Spark 1.1 binaries that do not include
the Hadoop jars?
On Wed, Oct 29, 2014 at 11:31 AM, Ron Ayoub ronalday...@live.com wrote:
Apparently Spark does require Hadoop even if you do not intend to use Hadoop.
Date: Wed, 29 Oct 2014 11:38:23 -0700
Subject: Re: winutils
From: denny.g@gmail.com
To: ronalday...@live.com
CC: user@spark.apache.org
QQ - did you download the Spark 1.1 binaries that included the Hadoop one?
Does this happen if you're using the Spark 1.1 binaries that do not include the
Hadoop jars?
cf. https://issues.apache.org/jira/browse/SPARK-2356
On Wed, Oct 29, 2014 at 7:31 PM, Ron Ayoub ronalday...@live.com wrote:
Apparently Spark does require Hadoop even if you do not intend to use
Hadoop. Is there a workaround for the below error I get when creating the
SparkContext in Scala?
I will note that I didn't have this problem yesterday when creating the Spark context in Java as part of the getting-started guide.
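For anyone else hitting this on Windows: a commonly reported workaround (not an official fix, and the path below is an example, not a required location) is to download a winutils.exe matching your Hadoop build, place it in a bin\ folder somewhere (e.g. C:\hadoop\bin\winutils.exe), and point Hadoop at that directory before the SparkContext is created, either through the HADOOP_HOME environment variable or the hadoop.home.dir system property. A minimal sketch, assuming the example path above:

```scala
// Sketch of the commonly reported workaround; C:\hadoop is an example path.
// On Windows, Hadoop's Shell utilities look for bin\winutils.exe under
// hadoop.home.dir (or HADOOP_HOME); when neither is set, the lookup produces
// the "Could not locate executable null\bin\winutils.exe" error seen above.
object WinutilsWorkaround {
  def main(args: Array[String]): Unit = {
    // Must run before the SparkContext (and thus Hadoop's Shell class)
    // initializes, since the lookup happens once at class-load time.
    System.setProperty("hadoop.home.dir", "C:\\hadoop")
    println(System.getProperty("hadoop.home.dir"))
  }
}
```

For spark-shell, where you cannot set the property before startup, set the environment variable in the same console before launching instead, e.g. `set HADOOP_HOME=C:\hadoop`.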