On 23 May 2016, at 06:32, Todd <bit1...@163.com> wrote:


Can someone please take a look at my question? I am running spark-shell in local mode and in yarn-client mode. The Spark code uses the Guava library, so Spark should have Guava available at runtime.

Thanks.



At 2016-05-23 11:48:58, "Todd" <bit1...@163.com> wrote:
Hi,
In the Spark code, the Guava Maven dependency scope is "provided". My question is: how does Spark depend on Guava at runtime? I looked into spark-assembly-1.6.1-hadoop2.6.1.jar and didn't find class entries like com.google.common.base.Preconditions etc.

Spark "shades" guava on import into the assembly; the libaries are moved to a 
new package and all internal references with it.

This is because Guava is a nightmare of backwards compatibility.
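
As a rough illustration of the technique (this is a generic maven-shade-plugin relocation sketch, not Spark's actual build configuration, and the shadedPattern name below is made up), relocating Guava in a pom.xml looks something like:

  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <executions>
      <execution>
        <phase>package</phase>
        <goals><goal>shade</goal></goals>
        <configuration>
          <relocations>
            <relocation>
              <!-- move Guava classes to a private package and rewrite all references to them -->
              <pattern>com.google.common</pattern>
              <shadedPattern>myapp.shaded.com.google.common</shadedPattern>
            </relocation>
          </relocations>
        </configuration>
      </execution>
    </executions>
  </plugin>

After a relocation like this, a class such as com.google.common.base.Preconditions lives under the relocated package in the resulting jar, which is why you don't see it under its original name when you list the assembly's entries.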

If you want Guava in your application, decide which version you want and declare it explicitly as a dependency.
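
For example, in your application's pom.xml (the version below is just an illustration; use whichever Guava release your code is actually tested against):

  <dependency>
    <groupId>com.google.guava</groupId>
    <artifactId>guava</artifactId>
    <version>19.0</version>
  </dependency>

Declaring it explicitly means your build resolves a known Guava version, rather than silently relying on whatever Spark or Hadoop happens to put on the classpath.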
