Not sure I understand why unifying how you submit an app for different
platforms, and dynamic configuration, cannot be part of SparkConf and
SparkContext?
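
Something along these lines is what I have in mind, i.e. the application
decides its own config values at runtime via SparkConf instead of having
spark-submit inject them (the property name and values below are made up):

    import org.apache.spark.{SparkConf, SparkContext}

    object MyApp {
      def main(args: Array[String]): Unit = {
        // pick up settings from system properties at runtime
        // instead of relying on spark-submit to set them
        val conf = new SparkConf()
          .setAppName("my-app")
          .set("spark.executor.memory",
            sys.props.getOrElse("myapp.executor.memory", "2g"))
        val sc = new SparkContext(conf)
        // ... job logic ...
        sc.stop()
      }
    }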

For the classpath, a simple script similar to "hadoop classpath" that shows
what needs to be added should be sufficient.

On Spark standalone I can launch a program just fine with just SparkConf
and SparkContext, but not on YARN, so the spark-submit script must be doing
a few extra things there that I am missing. That makes things more
difficult, because I am not sure it's realistic to expect every application
that needs to run something on Spark to be launched using spark-submit.
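
For reference, this is roughly what works for me against a standalone
cluster without going through spark-submit (the master URL and jar path are
just placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    // assumes the Spark jars are already on the driver's classpath
    val conf = new SparkConf()
      .setMaster("spark://master-host:7077") // placeholder standalone master
      .setAppName("standalone-launch")
      .setJars(Seq("target/my-app.jar"))     // ship the app jar to the executors
    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 100).sum())
    sc.stop()
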
 On Jul 9, 2014 3:45 AM, "Patrick Wendell" <pwend...@gmail.com> wrote:

> It fulfills a few different functions. The main one is giving users a
> way to inject Spark as a runtime dependency separately from their
> program and make sure they get exactly the right version of Spark. So
> a user can bundle an application and then use spark-submit to send it
> to different types of clusters (or with different versions of Spark).
>
> It also unifies the way you bundle and submit an app for YARN, Mesos,
> etc. This was something that had become very fragmented over time before
> spark-submit was added.
>
> Another feature is allowing users to set configuration values
> dynamically rather than compiling them into their program. That's
> the one you mention here. You can choose to use this feature or not.
> If you know your configs are not going to change, then you don't need
> to set them with spark-submit.
>
>
> On Wed, Jul 9, 2014 at 10:22 AM, Robert James <srobertja...@gmail.com>
> wrote:
> > What is the purpose of spark-submit? Does it do anything outside of
> > the standard val conf = new SparkConf ... val sc = new SparkContext
> > ... ?
>
