Thanks Mich
To be sure: are you really saying that, using the option
"spark.yarn.archive", YOU were able to OVERRIDE the installed Spark JARs
with the archive given through "spark.yarn.archive"?
With nothing more than "spark.yarn.archive"?
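For reference, a minimal sketch of how "spark.yarn.archive" is typically set up, assuming the archive is built from the local Spark installation and uploaded to HDFS (the paths and class name below are hypothetical examples, not values from this thread):

```shell
# Build an archive containing the Spark jars at its root
# ($SPARK_HOME and the HDFS target path are example values).
jar cv0f spark-libs.jar -C "$SPARK_HOME/jars/" .
hdfs dfs -put spark-libs.jar /user/spark/

# Point spark-submit at that archive instead of the locally installed jars.
spark-submit \
  --master yarn \
  --conf spark.yarn.archive=hdfs:///user/spark/spark-libs.jar \
  --class com.example.Main \
  app.jar
```

When "spark.yarn.archive" is set, YARN localizes that archive to each container, so the jars it contains are the ones put on the executors' classpath.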
Thanks
Dominique
On Thu, Nov 12, 2020 at 18:01, Mich
Thanks Russell
> Since the driver is responsible for moving jars specified in --jars, you
> cannot use a jar specified by --jars to be in driver-class-path, since the
> driver is already started and its classpath is already set before any jars
> are moved.
Your point is interesting. However, as I understand it, Spark expects the
jar files to be available on all nodes or, if applicable, in an HDFS
directory.
Putting Spark JAR files on HDFS
In YARN mode, *it is important that the Spark JAR files are available
throughout the Spark cluster*. I have spent a fair bit of time on this and
I recommend
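One common way to make the jars available cluster-wide is to upload them to a shared HDFS directory and point "spark.yarn.jars" at it. A sketch, assuming a local Spark install and an example HDFS path:

```shell
# Upload the Spark jars to a shared HDFS directory (example path).
hdfs dfs -mkdir -p /user/spark/share/jars
hdfs dfs -put "$SPARK_HOME"/jars/*.jar /user/spark/share/jars/

# Tell YARN where to find them; spark.yarn.jars accepts globs.
spark-submit \
  --master yarn \
  --conf spark.yarn.jars=hdfs:///user/spark/share/jars/*.jar \
  app.jar
```

This avoids re-uploading the Spark runtime on every submission, since YARN can cache the localized jars on the nodes.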
--driver-class-path does not move jars, so it is dependent on your Spark
resource manager (master). It is interpreted literally, so if your files do
not exist at the location you provide, relative to where the driver is run,
they will not be placed on the classpath.
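To illustrate the difference (a sketch with hypothetical paths, not a command from this thread): the path passed to --driver-class-path must already exist on the machine where the driver runs, while --jars ships the file for you:

```shell
# --jars distributes the jar to the cluster; --driver-class-path is read
# literally on the driver host and distributes nothing.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --jars /local/path/mylib.jar \
  --driver-class-path /local/path/mylib.jar \
  app.jar
# If /local/path/mylib.jar does not exist on the driver host, the
# --driver-class-path entry is silently useless.
```

This is consistent with the point quoted above: the driver's classpath is fixed before any --jars files are moved, so --driver-class-path cannot rely on that distribution step.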
Hi,
I am using Spark 2.1 (BTW) on YARN.
I am trying to upload JARs to the YARN cluster, and to use them to replace
the on-site (already in place) JARs.
I am trying to do so through spark-submit.
One helpful answer