Thanks hbogert. There it is plain as day; it can't find my spark binaries.
I thought it was enough to set SPARK_EXECUTOR_URI in my spark-env.sh, since
that is all that's necessary to run spark-shell against a Mesos master,
but I also had to set spark.executor.uri in my spark-defaults.conf (or
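For reference, a minimal sketch of both settings; the HDFS path and Spark tarball name below are assumptions, so substitute wherever your Spark binary package is actually hosted:

```shell
# spark-env.sh — read by spark-shell and the driver environment
export SPARK_EXECUTOR_URI=hdfs://namenode:8020/spark/spark-1.3.1-bin-hadoop2.6.tgz

# spark-defaults.conf — read by spark-submit
spark.executor.uri  hdfs://namenode:8020/spark/spark-1.3.1-bin-hadoop2.6.tgz
```

Both point the Mesos executors at the same Spark distribution; setting only one means some entry points (shell vs. submit) can't find the binaries.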
Thanks for the response. I'll admit I'm rather new to Mesos. Because of the
nature of my setup I can't use the Mesos web portal effectively: I'm not
connected by VPN, so the local-network links on the mesos-master dashboard
I SSH-tunnelled to aren't working.
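For what it's worth, the tunnel to the master UI can be opened roughly like this (the host names are placeholders; 5050 is the Mesos master's default web UI port), though links pointing at agents on private addresses will still break:

```shell
# Forward local port 5050 through a reachable gateway to the Mesos master's web UI,
# then browse http://localhost:5050. -N means no remote command, just forwarding.
ssh -N -L 5050:mesos-master.internal:5050 user@gateway.example.com
```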
Anyway, I was able to dig up some
Mesosphere did a great job of simplifying the process of running Spark on
Mesos. I am using this guide to set up a development Mesos cluster on Google
Compute Engine.
https://mesosphere.com/docs/tutorials/run-spark-on-mesos/
I can run the example that's in the guide by using spark-shell (finding
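Running the shell against the cluster looks roughly like this (the master's address is an assumption; 5050 is the default Mesos master port):

```shell
# Start an interactive Spark shell that schedules executors via the Mesos master
./bin/spark-shell --master mesos://10.0.0.1:5050
```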
The latter part of this question, where I try to submit the application by
referring to its location on HDFS, is very similar to the recent question
Spark-submit not working when application jar is in hdfs
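For completeness, this is the kind of submit invocation that fails for me when the application jar lives on HDFS (the master address, class name, and jar path are all placeholders):

```shell
# Submit an application whose jar is stored on HDFS rather than the local filesystem
./bin/spark-submit \
  --master mesos://10.0.0.1:5050 \
  --class com.example.MyApp \
  hdfs://namenode:8020/apps/my-app.jar
```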