we shade guava in our fat jar/assembly jar/application jar
On Tue, May 8, 2018 at 12:31 PM, Marcelo Vanzin wrote:
Using a custom Guava version with Spark is not that simple. Spark
shades Guava, but a lot of libraries Spark uses do not - the main one
being all of the Hadoop ones, and they need a quite old Guava.
So you have two options: shade/relocate Guava in your application, or use
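For the shade/relocate route, a minimal sketch of a maven-shade-plugin configuration that relocates Guava into a private package inside the application jar (the `myapp.shaded` prefix and the plugin version are illustrative choices, not from the thread):

```xml
<!-- in the application's pom.xml, under build/plugins -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>3.2.4</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals><goal>shade</goal></goals>
      <configuration>
        <relocations>
          <relocation>
            <!-- copy Guava's classes into the shaded jar under a new
                 package and rewrite the application's own bytecode
                 references to point at the relocated copy -->
            <pattern>com.google.common</pattern>
            <shadedPattern>myapp.shaded.com.google.common</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>
```

Note that relocation only rewrites classes packed into the shaded jar; Spark's and Hadoop's own Guava copies on the cluster classpath are left alone, which is exactly why the two versions stop colliding.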
I downgraded to Spark 2.0.1 and it fixed that *particular* runtime
exception, but then a similar one appears when saving to parquet:
A Stack Overflow question on this was created a month ago, and today further
details plus an open bounty were added to it:
I am intermittently running into Guava dependency issues across multiple
Spark projects. I have tried Maven shade/relocate, but it does not
resolve the issues.
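For an sbt build (the project below is plain Scala + Spark + ScalaTest, so sbt is a reasonable guess), the sbt-assembly equivalent of the Maven relocation is a ShadeRule - a sketch, with `shadedguava` as an illustrative target package:

```scala
// build.sbt (requires the sbt-assembly plugin in project/plugins.sbt)
assemblyShadeRules in assembly := Seq(
  // rename Guava's packages inside the fat jar and rewrite all
  // references in the application's own classes to match
  ShadeRule.rename("com.google.common.**" -> "shadedguava.@1").inAll
)
```

One caveat either way: shading only protects classes loaded by your own code. If the conflicting Guava call happens inside Spark or Hadoop itself, relocating Guava in the application jar cannot fix it.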
The current project is extremely simple: *no* additional dependencies
beyond Scala, Spark, and ScalaTest - yet the issues remain (and