I download the binaries for spark-1.0.2-hadoop1 and unpack them on my Windows
8 box.
I can execute spark-shell.cmd and get a command window which does the
proper things.
I open a browser to http://localhost:4040 and a page comes up describing
the Spark master.

Then, using IntelliJ, I create a project with JavaWordCount from the Spark
distribution.


When I run the job with -Dspark.master=spark://local[*]:7707 (I have
tried MANY other strings), the job fails with a failure to connect to the
Spark master.
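For context, the driver setup I am experimenting with looks roughly like the sketch below (the app name and input path are just placeholders, not the actual JavaWordCount code). My understanding is that "local[*]" runs Spark entirely in-process, with no standalone master involved, while a spark:// URL expects a real standalone master:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class WordCountDriver {
    public static void main(String[] args) {
        // "local[*]" runs Spark in-process using all cores; no master needed.
        // To use a standalone master it would instead be something like
        // "spark://<hostname>:7077" (7077 is the default master port).
        SparkConf conf = new SparkConf()
                .setAppName("JavaWordCount")   // placeholder name
                .setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Illustrative input path only.
        long lines = sc.textFile("README.md").count();
        System.out.println("lines: " + lines);

        sc.stop();
    }
}
```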

So my questions are:
1) Do I have a Spark master running? How can I tell? Doesn't the web page
say it is running?
2) How do I find the port on which the master is running, and test that it
is accepting jobs?
3) Are there other steps I need to take before I can run a simple Spark
sample?
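My current understanding (which may be wrong, hence the questions above) is that the UI on port 4040 is the per-application driver UI, not the master; a standalone master serves its own UI on port 8080 and by default listens for drivers on 7077. On Windows, something like the following should start a master and let me check the port:

```shell
REM Start a standalone master from the Spark install directory (Windows):
bin\spark-class.cmd org.apache.spark.deploy.master.Master

REM The master's own web UI should then be at http://localhost:8080,
REM and it displays the URL to pass to the driver, e.g. spark://<hostname>:7077

REM Check whether anything is actually listening on the master port:
netstat -an | findstr 7077
```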

14/08/21 09:27:08 INFO scheduler.TaskSchedulerImpl: Adding task set 1.0
with 1 tasks
14/08/21 09:27:23 WARN scheduler.TaskSchedulerImpl: Initial job has not
accepted any resources; check your cluster UI to ensure that workers are
registered and have sufficient memory
...

14/08/21 09:28:08 ERROR cluster.SparkDeploySchedulerBackend: Application
has been killed. Reason: All masters are unresponsive! Giving up.
14/08/21 09:28:08 INFO scheduler.TaskSchedulerImpl: Removed TaskSet 1.0,
whose tasks have all completed, from pool
14/08/21 09:28:08 INFO scheduler.TaskSchedulerImpl: Cancelling stage 1
14/08/21 09:28:08 INFO scheduler.DAGScheduler: Failed to run collect at
JavaWordCount.java:68
Exception in thread "main" org.apache.spark.SparkException: Job aborted due
to stage failure: All masters are unresponsive! Giving up.
