cf. https://issues.apache.org/jira/browse/SPARK-2356
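
Until that is fixed, the workaround people usually report is to point
hadoop.home.dir (or the HADOOP_HOME environment variable) at a directory
containing bin\winutils.exe before the SparkContext is created. Here is a
minimal sketch in Scala; the C:\hadoop path is only an assumption, put
winutils.exe wherever you like:

  import org.apache.spark.{SparkConf, SparkContext}

  object WinutilsWorkaround {
    def main(args: Array[String]): Unit = {
      // Hadoop's shell utilities look for %HADOOP_HOME%\bin\winutils.exe
      // on Windows. Setting hadoop.home.dir has the same effect, and it
      // must happen before the SparkContext is constructed.
      System.setProperty("hadoop.home.dir", "C:\\hadoop") // assumed path

      val conf = new SparkConf().setAppName("winutils-test").setMaster("local[*]")
      val sc = new SparkContext(conf)
      println(sc.parallelize(1 to 10).sum()) // quick sanity check
      sc.stop()
    }
  }

Note that the property only tells Hadoop where to look: you still need an
actual winutils.exe binary under that directory's bin folder.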

On Wed, Oct 29, 2014 at 7:31 PM, Ron Ayoub <ronalday...@live.com> wrote:
> Apparently Spark requires Hadoop even if you do not intend to use Hadoop.
> Is there a workaround for the error below, which I get when creating the
> SparkContext in Scala?
>
> I will note that I didn't have this problem yesterday when creating the
> Spark context in Java as part of the getting-started app. It could be that
> the Maven project I was using to manage dependencies took care of it for
> me, or that JavaSparkContext follows a different code path.
>
> I would say this is a pretty big bug if Spark is meant to be general
> purpose, since it now appears that Spark depends on Hadoop.
>
> "Could not locate executable null\bin\winutils.exe in the Hadoop binaries"
>
