Actually, I would say yes and no. Yes: the jar will be fetched by the executors and added to their classpath. No: it will not be added to the classpath of the driver, which means you cannot reference classes from the jar directly in driver code. But you can still call them indirectly, like the following (or, if the jar is only a dependency, it never needs to be referenced directly):
rdd.map(e => { Class.forName("com.zjffdu.tutorial.spark.java.MyStack"); e }).collect()

On Sat, Dec 19, 2015 at 5:47 AM, Jim Lohse <j...@megalearningllc.com> wrote:

> I am going to say no, but have not actually tested this. Just going on
> this line in the docs:
>
> http://spark.apache.org/docs/latest/configuration.html
>
> spark.driver.extraClassPath (none)  Extra classpath entries to prepend to
> the classpath of the driver.
> Note: In client mode, this config must not be set through the SparkConf
> directly in your application, because the driver JVM has already started at
> that point. Instead, please set this through the --driver-class-path
> command line option or in your default properties file.
>
> On 12/17/2015 07:53 AM, amarouni wrote:
>
> Hello guys,
>
> Do you know if the method SparkContext.addJar("file:///...") can be used
> on a running context (an already started spark-shell)?
> And if so, does it add the jar to the classpath of the Spark workers
> (Yarn containers in the case of yarn-client)?
>
> Thanks,

--
Best Regards

Jeff Zhang
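To make the yes-and-no concrete: whether `Class.forName` succeeds depends only on the classpath of the JVM it runs in. The following is a plain-JVM sketch (no Spark involved) of the failure mode on the driver side; `com.example.NotOnDriver` is a made-up class name standing in for a class that was shipped only via `addJar`:

```java
// Plain JVM demo of classpath visibility. Class.forName inside an executor
// closure works because the executor JVM has the addJar'd jar on its
// classpath; the same call on the driver JVM would fail.
public class ClasspathDemo {

    // Returns true iff this JVM can load the named class.
    static boolean canLoad(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // java.util.ArrayList is on every JVM's classpath, so this succeeds.
        System.out.println(canLoad("java.util.ArrayList"));

        // A class shipped only through SparkContext.addJar is visible to the
        // executors but not to this (driver-like) JVM, so the lookup fails.
        // "com.example.NotOnDriver" is a hypothetical name for illustration.
        System.out.println(canLoad("com.example.NotOnDriver"));
    }
}
```

That is why the `rdd.map(...)` trick above works: the `Class.forName` call executes inside the task closure on the executors, where the jar is present.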
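And for the driver-side classpath the quoted docs describe: in client mode it has to be set before the driver JVM starts, so it goes on the launch command line rather than into SparkConf. A minimal sketch, where the jar path and main class are placeholders:

```shell
# Hypothetical invocation: /path/to/extra.jar and com.example.Main are
# placeholders. --driver-class-path prepends entries to the driver's
# classpath at JVM startup, which SparkConf in client mode cannot do.
spark-submit \
  --driver-class-path /path/to/extra.jar \
  --class com.example.Main \
  app.jar
```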