Ah. I've avoided using spark-submit primarily because we use Spark as part
of an analytics library that's meant to be embedded in other applications
with their own lifecycle management.

One of those applications is a REST app running in Tomcat, which makes
using spark-submit difficult (if not impossible).

Also, we're trying to avoid sending jars over the wire per-job, so we
install our library (minus the Spark dependencies) on the Mesos workers and
refer to it in the Spark configuration using spark.executor.extraClassPath.
If I'm reading SparkSubmit.scala correctly, it looks like the user's
assembly ends up being sent to the cluster (at least in the YARN case),
though I could be wrong about this.
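
For concreteness, here's roughly the shape of what we do today -- just a
sketch, where the master URL, app name, and install path are placeholders
for our actual values:

    import org.apache.spark.{SparkConf, SparkContext}

    // Build the context programmatically instead of going through
    // spark-submit; the embedding application owns the JVM lifecycle.
    val conf = new SparkConf()
      .setAppName("embedded-analytics")                  // placeholder name
      .setMaster("mesos://zk://zkhost:2181/mesos")       // placeholder Mesos master
      // Library is pre-installed on each worker, so no jars go over the wire:
      .set("spark.executor.extraClassPath", "/opt/ourlib/lib/*")
    val sc = new SparkContext(conf)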

Is there a standard way of running an app that's in control of its own
runtime lifecycle without spark-submit?

Thanks again.
Jim

