RE: Jars directory in Spark 2.0

2017-02-01 Thread Sidney Feiner
Spark has never shaded dependencies (in the sense of renaming the classes), with a couple of exceptions (Guava and Jetty). So that behavior is not …

Re: Jars directory in Spark 2.0

2017-02-01 Thread Marcelo Vanzin
> you basically have to keep your versions of dependencies in line with Spark's, or shade your own dependencies. You cannot just replace the jars in Spark's jars …

RE: Jars directory in Spark 2.0

2017-01-31 Thread Sidney Feiner
From: Koert Kuipers [mailto:ko...@tresata.com] Sent: Tuesday, January 31, 2017 7:26 PM To: Sidney Feiner <sidney.fei...@startapp.com> Cc: user@spark.apache.org Subject: Re: Jars directory in Spark 2.0 — you basically have to keep your versions of dependencies in line with Spark's, or shade your own depend…

Re: Jars directory in Spark 2.0

2017-01-31 Thread Koert Kuipers
You basically have to keep your versions of dependencies in line with Spark's, or shade your own dependencies. You cannot just replace the jars in Spark's jars folder. If you want to update them, you have to build Spark yourself with the updated dependencies and confirm that it compiles, passes tests, etc.
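The shading approach Koert describes can be sketched as an sbt-assembly configuration. This is a hedged fragment, not Spark's official setup: it assumes the sbt-assembly plugin is enabled, and the package names (`myapp.shaded.guava`) and the Guava example are illustrative — substitute whichever library conflicts with the versions in Spark's jars/ directory.

```scala
// build.sbt fragment (assumes the sbt-assembly plugin is enabled).
// ShadeRule.rename relocates our copy of Guava under a private package,
// so it cannot clash with the Guava version bundled in Spark's jars/.
assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.common.**" -> "myapp.shaded.guava.@1").inAll
)

// Spark itself is provided by the cluster at runtime, so mark it
// "provided" and keep it out of the assembled fat jar entirely.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0" % "provided"
```

With this in place, `sbt assembly` rewrites the bytecode references in your jar so both Guava versions can coexist on the classpath.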

Jars directory in Spark 2.0

2017-01-31 Thread Sidney Feiner
Hey, while migrating from Spark 1.6 to 2.X, I've had many issues with the jars that come preloaded with Spark in the "jars/" directory, and I had to shade most of my packages. Can I replace the jars in this folder with more up-to-date versions? Are those jars used for anything internal in Spark which …
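A practical first step for the problem described above is to check exactly which versions Spark bundles, so you can pin your own build to match instead of replacing Spark's jars. A minimal sketch, assuming `SPARK_HOME` points at a Spark 2.x installation (`/opt/spark` is only a placeholder fallback), using Jackson as an example of a commonly conflicting library:

```shell
# List the bundled jars matching a library name, so your own build
# can depend on the same version Spark ships with.
spark_home="${SPARK_HOME:-/opt/spark}"
if [ -d "$spark_home/jars" ]; then
  ls "$spark_home/jars" | grep -i jackson
else
  echo "jars directory not found under $spark_home"
fi
```

The version numbers embedded in the jar file names (e.g. `jackson-core-2.x.y.jar`) are the ones to align with in your build file.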