Thanks Marcelo - I was using the sbt-built Spark per the earlier thread. I've
now switched to the distro (with the conf changes putting the CDH path in
front) and the guava issue is gone.
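For anyone who hits this later and can't switch builds: one way to keep the older Hadoop guava off the classpath is to filter the jar out when assembling the path. This is just a sketch; the Hadoop client directory and jar names are illustrative (the demo below uses a temp dir with dummy jar files so it's runnable anywhere):

```shell
# Demo of filtering the guava jar out of a classpath built from a jar directory.
# In a real setup the directory would be something like /usr/lib/hadoop/client.
dir=$(mktemp -d)
touch "$dir/guava-11.0.2.jar" "$dir/hadoop-common.jar"   # dummy jars for the demo

# Build the classpath, excluding any guava jar so the one bundled
# with Spark is the only guava on the classpath.
CP=$(ls "$dir"/*.jar | grep -v guava | tr '\n' ':')
echo "$CP"
```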

Thanks,

On Tue, Mar 24, 2015 at 1:50 PM, Marcelo Vanzin <van...@cloudera.com> wrote:

> Hi there,
>
> On Tue, Mar 24, 2015 at 1:40 PM, Manoj Samel <manojsamelt...@gmail.com>
> wrote:
> > When I run any query, it gives java.lang.NoSuchMethodError:
> >
> com.google.common.hash.HashFunction.hashInt(I)Lcom/google/common/hash/HashCode;
>
> Are you running a custom-compiled Spark by any chance? Specifically,
> one you built with sbt? That would hit this problem, because the path
> I suggested (/usr/lib/hadoop/client/*) contains an older guava
> library, which would override the one shipped with the sbt-built
> Spark.
>
> If you build Spark with maven, or use the pre-built Spark distro, or
> specifically filter out the guava jar from your classpath when setting
> up the Spark job, things should work.
>
> --
> Marcelo
>
> --
>
> ---
> You received this message because you are subscribed to the Google Groups
> "CDH Users" group.
>
