Hi Corey,

When you run on Yarn, Yarn's libraries are placed on the classpath
ahead of your application's jars, so they take precedence. That means
with Spark 1.2 you'll get Guava 11 on your classpath (and with Spark
1.1 and earlier you'd get Guava 14 from Spark, so it's still a problem
for you).

Right now, the option Markus mentioned
(spark.yarn.user.classpath.first) can be a workaround for you, since
it will place your app's jars before Yarn's on the classpath.
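As an illustrative sketch only (the application jar, main class, and Guava version below are placeholders, not from this thread), the option can be passed to spark-submit like this:

```shell
# Hypothetical submission; my-app.jar and com.example.MyApp are stand-ins
# for your own build. The --conf line is the workaround discussed above:
# it asks YARN to put your app's jars ahead of the cluster's on the classpath.
spark-submit \
  --master yarn-cluster \
  --conf spark.yarn.user.classpath.first=true \
  --class com.example.MyApp \
  my-app.jar
```

The same property could instead be set in spark-defaults.conf if you want it applied to every submission.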


On Tue, Feb 3, 2015 at 8:20 PM, Corey Nolet <cjno...@gmail.com> wrote:
> I'm having a really bad dependency conflict right now with Guava versions
> between my Spark application in Yarn and (I believe) Hadoop's version.
>
> The problem is, my driver has the version of Guava which my application is
> expecting (15.0) while it appears the Spark executors that are working on my
> RDDs have a much older version (assuming it's the old version on the Hadoop
> classpath).
>
> Is there a property like "mapreduce.job.user.classpath.first" that I can set
> to make sure my own classpath is established first on the executors?



-- 
Marcelo
