To: Sidney Feiner <sidney.fei...@startapp.com>
Cc: Koert Kuipers <ko...@tresata.com>; user@spark.apache.org
Subject: Re: Jars directory in Spark 2.0
Spark has never shaded dependencies (in the sense of renaming the classes),
with a couple of exceptions (Guava and Jetty). So that behavior is not new in 2.0.
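For readers unfamiliar with the term: "shading" here means relocating a dependency's classes under a private package name at build time, so they cannot clash with the copies Spark ships in its jars/ directory. A minimal maven-shade-plugin sketch of the idea (the choice of Guava and the relocated package name are illustrative assumptions, not something from this thread):

```xml
<!-- Hedged sketch: relocate your own copy of Guava under a private
     package with maven-shade-plugin so it cannot conflict with the
     Guava in Spark's jars/ directory. Package names are examples. -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <pattern>com.google.common</pattern>
            <shadedPattern>myapp.shaded.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```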
From: Koert Kuipers [mailto:ko...@tresata.com]
Sent: Tuesday, January 31, 2017 7:26 PM
To: Sidney Feiner <sidney.fei...@startapp.com>
Cc: user@spark.apache.org
Subject: Re: Jars directory in Spark 2.0
you basically have to keep your versions of dependencies in line with Spark's
or shade your own dependencies.
you cannot just replace the jars in Spark's jars folder. if you want to
update them you have to build Spark yourself with the updated dependencies and
confirm it compiles, passes tests, etc.
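The "build Spark yourself" route looks roughly like the sketch below. The profile names, the --name label, and the hadoop.version override are examples only; which properties can actually be overridden depends on the root pom.xml of the Spark branch you build:

```shell
# Hedged sketch: build a full Spark distribution from a source checkout.
# Profiles (-Phadoop-2.7, -Phive) and the hadoop.version override are
# examples; check Spark's pom.xml and build docs for your branch.
./dev/make-distribution.sh --name custom --tgz \
  -Phadoop-2.7 -Dhadoop.version=2.7.3 -Phive -DskipTests
```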
Sidney Feiner <sidney.fei...@startapp.com> wrote:
Hey,
While migrating to Spark 2.X from 1.6, I've had many issues with jars that come
preloaded with Spark in the "jars/" directory and I had to shade most of my
packages.
Can I replace the jars in this folder with more up-to-date versions? Are those
jars used for anything internal in Spark that might break if I replace them?
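One supported alternative to editing jars/ at all, sketched below under the assumption that the newer library is compatible with your own code: ship the jar with the application and set the userClassPathFirst flags so your copy takes precedence over Spark's. These flags are marked experimental in Spark's configuration docs, and the paths and class name here are placeholders:

```shell
# Hedged sketch: prefer the application's copy of a dependency over
# the one in Spark's jars/. userClassPathFirst is experimental; test
# carefully. Paths and the main class are placeholders.
spark-submit \
  --jars /path/to/newer-dependency.jar \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --class com.example.MyApp my-application.jar
```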