I have recently encountered a similar problem with Guava version collision
with Hadoop.
Isn't it more correct to upgrade Hadoop to use the latest Guava? Does anyone
know why they are staying on version 11?
*Romi Kuntsman*, *Big Data Engineer*
http://www.totango.com
Actually, there is already someone on Hadoop-Common-Dev taking care of
removing the old Guava dependency:
http://mail-archives.apache.org/mod_mbox/hadoop-common-dev/201501.mbox/browser
https://issues.apache.org/jira/browse/HADOOP-11470
*Romi Kuntsman*, *Big Data Engineer*
http://www.totango.com
Please see this thread:
http://search-hadoop.com/m/LgpTk2aVYgr/Hadoop+guava+upgradesubj=Re+Time+to+address+the+Guava+version+problem
On Jan 19, 2015, at 6:03 AM, Romi Kuntsman r...@totango.com wrote:
> I have recently encountered a similar problem with Guava version collision
> with Hadoop.
-dev
Guava was not downgraded to 11. That PR was not merged. It was part of a
discussion about, indeed, what to do about potential Guava version
conflicts. Spark uses Guava, but so does Hadoop, and so do user programs.
Spark uses 14.0.1 in fact:
Hi Sean,
My mistake, the Guava 11 dependency came from hadoop-common indeed.
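(For the record, running "mvn dependency:tree -Dincludes=com.google.guava"
shows exactly which module pulls Guava in.)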
I'm running the following simple app on a Spark 1.2.0 standalone local
cluster (2 workers) with Hadoop 1.2.1:
public class AvroSparkTest {
    public static void main(String[] args) throws Exception {
        SparkConf sparkConf = new SparkConf().setAppName("AvroSparkTest");
        // ...
    }
}
Oh, are you actually bundling Hadoop in your app? That may be the problem.
If you're using standalone mode, why include Hadoop at all? In any event,
Spark and Hadoop are intended to be 'provided' dependencies in the app you
send to spark-submit.
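For example, something like this in the app's POM (a sketch; the artifact
IDs and versions are assumptions based on your Spark 1.2.0 / Hadoop 1.2.1
setup, so adapt as needed):

<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.2.0</version>
  <scope>provided</scope>
</dependency>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>1.2.1</version>
  <scope>provided</scope>
</dependency>

With 'provided' scope, the cluster's own Spark and Hadoop jars (and
whichever Guava they carry) are used at runtime, rather than copies bundled
into your app jar.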
On Tue, Jan 6, 2015 at 10:15 AM, Niranda Perera wrote:
Hi,
I have been running a simple Spark app on a local Spark cluster and came
across this error:
Exception in thread "main" java.lang.NoSuchMethodError:
com.google.common.hash.HashFunction.hashInt(I)Lcom/google/common/hash/HashCode;
at org.apache.spark.util.collection.OpenHashSet.org
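For context: the missing method, HashFunction.hashInt(int), only exists in
Guava releases newer than the Guava 11 that Hadoop ships, which is why
Spark's OpenHashSet fails when Hadoop's Guava ends up first on the
classpath. A minimal sketch that reproduces the same error when run against
Guava 11 (the class name is illustrative):

import com.google.common.hash.HashCode;
import com.google.common.hash.Hashing;

public class GuavaHashIntCheck {
    public static void main(String[] args) {
        // Compiles against Guava 12+, but throws the NoSuchMethodError
        // above at runtime if an older Guava (e.g. Hadoop's 11.x) wins
        // on the classpath.
        HashCode code = Hashing.murmur3_32().hashInt(42);
        System.out.println(code);
    }
}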