If they are tied to the SparkContext, then why can't the subprocess be started up with the extra jars (sc.addJar) already on its classpath? That way a switch like user-jars-first would be a simple rearranging of the classpath for the subprocess, and the messing with classloaders that is currently done in Executor (which forces people to use reflection in certain situations, and is broken if you want user jars first) would be history.

On May 20, 2014 1:07 AM, "Matei Zaharia" <matei.zaha...@gmail.com> wrote:
> They’re tied to the SparkContext (application) that launched them.
>
> Matei
>
> On May 19, 2014, at 8:44 PM, Koert Kuipers <ko...@tresata.com> wrote:
>
> From looking at the source code, I see executors run in their own JVM
> subprocesses.
>
> How long do they live? As long as the worker/slave? Or are they tied
> to the SparkContext and live/die with it?
>
> thx
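To illustrate the idea in Koert's reply: if the executor JVM were launched with the user jars already on its subprocess classpath, a user-jars-first switch would just reorder the classpath entries instead of requiring a child-first classloader inside Executor. A minimal sketch of that reordering, assuming hypothetical `sparkJars`/`userJars` lists (none of these names are real Spark APIs):

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch, not actual Spark code: build the -cp string for the
// executor subprocess so that "user jars first" is just a reordering.
public class ClasspathSketch {
    static String buildClasspath(List<String> sparkJars,
                                 List<String> userJars,
                                 boolean userJarsFirst) {
        List<String> ordered = new ArrayList<>();
        if (userJarsFirst) {
            ordered.addAll(userJars);   // user jars win class lookups
            ordered.addAll(sparkJars);
        } else {
            ordered.addAll(sparkJars);  // default: framework jars win
            ordered.addAll(userJars);
        }
        // The JVM searches classpath entries in order, so ordering alone
        // decides which copy of a class is loaded -- no custom classloader.
        return String.join(File.pathSeparator, ordered);
    }

    public static void main(String[] args) {
        System.out.println(buildClasspath(
            List.of("/opt/spark/spark-assembly.jar"),
            List.of("/tmp/myapp.jar"),
            true));
    }
}
```

Because the JVM resolves classes from classpath entries left to right, putting user jars first gives them precedence without any reflection tricks.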