Well, I got past this problem, and the answer was in my own email. I did
download the build with Hadoop, since it was among the only prebuilt binaries
(along with the CDH and MapR builds) that you don't have to compile from
source. It worked yesterday because I had added 1.1.0 as a Maven dependency
from the repository. I just did the same thing again and it worked perfectly.
One peculiarity I will mention: even with Scala IDE installed in Eclipse, when
I created the Maven project per the instructions on the web and installed the
connector, I still did not get the Scala perspective, nor could I right-click
and add Scala types. This time around, I used the Scala IDE project wizard to
create a simple non-Maven app and then converted it to Maven, and all features
seem to work fine.
I will also note that I'm learning Java, Scala, Eclipse, Spark, and Maven all
at the same time. Kind of overkill. But part of the frustration was following
along with the Maven Scala project instructions using an archetype that is
badly out of date. So now I think I have found a good approach to getting up
and running with Spark (1. Eclipse, 2. Scala IDE, 3. Scala Wizard Project,
4. Convert to Maven, 5. Add Spark dependency, as sketched below).
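For step 5, the dependency entry looks something like this (a minimal sketch
of the pom.xml entry; spark-core_2.10 is the Scala 2.10 artifact that Spark
1.1.0 is published under on Maven Central):

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.1.0</version>
    </dependency>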

Date: Wed, 29 Oct 2014 11:38:23 -0700
Subject: Re: winutils
From: denny.g....@gmail.com
To: ronalday...@live.com
CC: user@spark.apache.org

QQ - did you download the Spark 1.1 binaries that include Hadoop?
Does this happen if you're using the Spark 1.1 binaries that do not include
the Hadoop jars?
On Wed, Oct 29, 2014 at 11:31 AM, Ron Ayoub <ronalday...@live.com> wrote:

Apparently Spark does require Hadoop even if you do not intend to use Hadoop.
Is there a workaround for the error below, which I get when creating the
SparkContext in Scala?
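For concreteness, the failing code is essentially the following (a minimal
sketch; the object name, app name, and local master string are placeholders
for my actual setup):

    import org.apache.spark.{SparkConf, SparkContext}

    object SparkTest {
      def main(args: Array[String]): Unit = {
        // Purely local run; no Hadoop cluster is involved.
        val conf = new SparkConf().setAppName("SparkTest").setMaster("local[*]")
        val sc = new SparkContext(conf) // throws the error below on Windows
        sc.stop()
      }
    }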
I will note that I didn't have this problem yesterday when creating the Spark
context in Java as part of the getting-started app. It could be because I was
using a Maven project to manage dependencies and that did something for me, or
else JavaSparkContext has some different code.
I would say that, for Spark to be general purpose, this is a pretty big bug,
since it now appears Spark depends upon Hadoop.
"Could not locate executable null\bin\winutils.exe in the Hadoop binaries"