Yes, spark-submit adds all of this for you. You don't bring Spark classes in
your app.
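
For example, here is a minimal pom.xml sketch of what that looks like (the
spark-core_2.10 / 1.4.0 coordinates below are just an assumption; use
whatever matches the Spark build installed on your cluster):

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>1.4.0</version>
      <!-- provided: compile against Spark APIs like RDD, but keep Spark
           classes out of the application assembly -->
      <scope>provided</scope>
    </dependency>

Launching with something like "spark-submit --class com.example.MyDriver
--master <master URL> my-driver.jar" (class and jar names here are made up)
puts the jars from the installed distribution on the classpath, so RDD and
friends come from the cluster's own Spark rather than from your assembly.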

On Thu, Jun 25, 2015, 4:01 PM jimfcarroll <jimfcarr...@gmail.com> wrote:

> Hi Sean,
>
> I'm packaging Spark with my (standalone) driver app using Maven. Any
> assemblies that are used on the Mesos workers, whether by extending the
> classpath or by providing the jars from the driver (via the SparkConf),
> aren't packaged with Spark (it seems obvious that would be a mistake).
>
> I need, for example, "RDD" on my classpath in order for my driver app to
> run. Are you saying I need to mark Spark as provided in Maven and put the
> jars from an installed distribution's lib directory on my classpath?
>
> I'm not using anything but the jar files from a Spark install in my driver,
> so that seemed superfluous (and slightly harder to manage at deployment).
> Also, even if that's the case, I don't understand why the Maven dependency
> for the same version would contain different versions of classes than the
> deployable distribution itself.
>
> Thanks for your patience.
> Jim
>
