RE: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

2015-12-03 Thread Mich Talebzadeh
-Original Message- From: Marcelo Vanzin [mailto:van...@cloudera.com] Sent: 03 December 2015 18:45 To: Mich Talebzadeh <m...@peridale.co.uk> Cc: u...@hive.apache.org; user <user@spark.apache.org> Subject: Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

2015-12-03 Thread Marcelo Vanzin
(bcc: user@spark, since this is Hive code.) You're probably including unneeded Spark jars in Hive's classpath somehow: either the whole assembly jar or spark-hive, both of which bundle Hive classes, and in this case old versions that conflict with the version of Hive you're running.
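A quick way to act on this advice is to look for Spark assembly or spark-hive jars that have leaked into Hive's lib directory. A minimal sketch, assuming a conventional layout under `HIVE_HOME` (the path and jar-name patterns are assumptions, not from the thread):

```shell
# List any spark-assembly / spark-hive jars sitting on Hive's classpath.
# These bundle their own (possibly older) Hive classes and can trigger
# NoSuchFieldError at runtime; they should not live under Hive's lib dir.
HIVE_LIB="${HIVE_HOME:-/usr/lib/hive}/lib"
ls "$HIVE_LIB" 2>/dev/null | grep -Ei 'spark-(assembly|hive)' \
  || echo "no spark-assembly/spark-hive jars under $HIVE_LIB"
```

Any jar this prints is a candidate to move out of Hive's lib directory (Hive on Spark only needs a small Spark client footprint, not the full assembly).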

RE: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

2015-12-03 Thread Mich Talebzadeh
Sent: 03 December 2015 18:07 To: u...@hive.apache.org Cc: user@spark.apache.org Subject: Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT maybe you compile and run against different versions of spark? On Thu, Dec 3,
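The "compiled against one Spark, running against another" hypothesis is easy to rule in or out by printing the runtime Spark version and comparing it with the version Hive was built for. A minimal sketch (the `SPARK_HOME` default is an assumption about the install layout):

```shell
# Print the Spark version actually on this box; compare it with the Spark
# version your Hive build targets (e.g. Spark 1.3.x in this thread).
SPARK_BIN="${SPARK_HOME:-/usr/lib/spark}/bin/spark-submit"
if [ -x "$SPARK_BIN" ]; then
  "$SPARK_BIN" --version 2>&1 | head -n 3
else
  echo "spark-submit not found at $SPARK_BIN"
fi
```

If the runtime version differs from the one Hive was compiled against, a `NoSuchFieldError` like this one is exactly the kind of binary-incompatibility symptom to expect.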

Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

2015-12-03 Thread Mich Talebzadeh
Trying to run Hive on the Spark 1.3 engine, I get: --conf hive.spark.client.channel.log.level=null --conf hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256 15/12/03 17:53:18 [stderr-redir-1]: INFO client.SparkClientImpl: Spark

Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

2015-12-03 Thread Marcelo Vanzin
On Thu, Dec 3, 2015 at 10:32 AM, Mich Talebzadeh wrote: > hduser@rhes564::/usr/lib/spark/logs> hive --version > SLF4J: Found binding in > [jar:file:/usr/lib/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class] As I suggested before, you
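The SLF4J line above shows `spark-assembly-1.3.0-hadoop2.4.0.jar` answering for Hive, which is the smoking gun: the assembly bundles its own Hive classes. A hedged sketch for confirming which jars bundle Hive's Spark-client code, assuming the package path `org/apache/hive/spark/client` (an assumption based on Hive's source layout, not stated in the thread):

```shell
# Scan Spark's lib dir for jars that bundle Hive Spark-client classes.
# Such a jar shadows the real Hive classes and can cause NoSuchFieldError
# on fields (like SPARK_RPC_CLIENT_CONNECT_TIMEOUT) added in newer Hive.
SPARK_LIB="${SPARK_HOME:-/usr/lib/spark}/lib"
found=0
for j in "$SPARK_LIB"/*.jar; do
  [ -e "$j" ] || continue   # glob matched nothing
  if unzip -l "$j" 2>/dev/null | grep -q 'org/apache/hive/spark/client'; then
    echo "bundles Hive Spark-client classes: $j"
    found=1
  fi
done
[ "$found" -eq 1 ] || echo "no jars under $SPARK_LIB bundle Hive Spark-client classes"
```

A jar flagged here should not also be on Hive's classpath; removing the overlap is the fix the thread converges on.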