Hi Marcelo.

So this is the approach I am going to take (sketched below):

Use Spark 1.3 pre-built
Use Hive 1.2.1; do not copy anything from the Spark 1.3 libraries into the
Hive libraries
Use Hadoop 2.6
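
For reference, here is roughly how I intend to wire it together. This is only
a sketch: the install paths are placeholders for my layout, and the property
names (hive.execution.engine, spark.master, spark.home) are the ones from the
Hive on Spark getting-started guide:

  # Point Hive at the pre-built Spark 1.3 install instead of copying jars.
  # Paths are placeholders; adjust to your own layout.
  export HADOOP_HOME=/usr/lib/hadoop     # Hadoop 2.6
  export SPARK_HOME=/usr/lib/spark       # Spark 1.3 pre-built
  export HIVE_HOME=/usr/lib/hive         # Hive 1.2.1

  # Enable Spark as the execution engine for this session only, rather
  # than adding Spark's assembly to Hive's classpath.
  hive --hiveconf hive.execution.engine=spark \
       --hiveconf spark.master=yarn-client \
       --hiveconf spark.home="$SPARK_HOME"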

There is no need to mess around with the libraries. I will unset my
CLASSPATH, set it again cleanly, and retry, as shown below.
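
To double-check afterwards, something like this should do it (a sketch,
assuming bash; the grep target is the SLF4J binding line from the output
quoted further down):

  # Start from a clean environment so the Spark assembly is not inherited.
  unset CLASSPATH

  # See which SLF4J binding Hive actually loads at startup.
  hive --version 2>&1 | grep StaticLoggerBinder

  # If spark-assembly-1.3.0-hadoop2.4.0.jar still appears here, the Spark
  # jars are still leaking into Hive's classpath.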


Thanks,


Mich Talebzadeh

Sybase ASE 15 Gold Medal Award 2008
A Winning Strategy: Running the most Critical Financial Data on ASE 15
http://login.sybase.com/files/Product_Overviews/ASE-Winning-Strategy-091908.pdf
Author of the books "A Practitioner’s Guide to Upgrading to Sybase ASE 15", 
ISBN 978-0-9563693-0-7. 
co-author "Sybase Transact SQL Guidelines Best Practices", ISBN 
978-0-9759693-0-4
Publications due shortly:
Complex Event Processing in Heterogeneous Environments, ISBN: 978-0-9563693-3-8
Oracle and Sybase, Concepts and Contrasts, ISBN: 978-0-9563693-1-4, volume one 
out shortly

http://talebzadehmich.wordpress.com

NOTE: The information in this email is proprietary and confidential. This 
message is for the designated recipient only; if you are not the intended 
recipient, you should destroy it immediately. Any information in this message 
shall not be understood as given or endorsed by Peridale Technology Ltd, its 
subsidiaries or their employees, unless expressly so stated. It is the 
responsibility of the recipient to ensure that this email is virus free; 
therefore neither Peridale Ltd, its subsidiaries nor their employees accept 
any responsibility.

-----Original Message-----
From: Marcelo Vanzin [mailto:van...@cloudera.com] 
Sent: 03 December 2015 18:45
To: Mich Talebzadeh <m...@peridale.co.uk>
Cc: u...@hive.apache.org; user <user@spark.apache.org>
Subject: Re: Any clue on this error, Exception in thread "main" 
java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

On Thu, Dec 3, 2015 at 10:32 AM, Mich Talebzadeh <m...@peridale.co.uk> wrote:

> hduser@rhes564::/usr/lib/spark/logs> hive --version
> SLF4J: Found binding in
> [jar:file:/usr/lib/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar!/org
> /slf4j/impl/StaticLoggerBinder.class]

As I suggested before, you have Spark's assembly in the Hive classpath. That's 
not the way to configure Hive-on-Spark; if the documentation you're following 
tells you to do that, it's wrong.
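
A quick way to confirm where the assembly is coming from (a sketch, assuming
a bash shell; $HIVE_HOME is a placeholder for your Hive install):

  # Look for Spark's assembly anywhere Hive will pick it up.
  echo "$CLASSPATH" | tr ':' '\n' | grep -i spark
  ls $HIVE_HOME/lib | grep -i spark

  # A spark-assembly-*.jar in either place means Hive loads Spark's bundled
  # copies of shared classes ahead of its own.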

(And sorry, Ted, but please ignore Ted's suggestion. Hive-on-Spark should work 
fine with Spark 1.3 if it's configured correctly. You really don't want to be 
overriding Hive classes with the ones shipped in the Spark assembly, regardless 
of the version of Spark being used.)
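
If in doubt, you can check which jar is actually supplying the Hive classes
(a sketch, assuming bash with unzip on the PATH; the SPARK_RPC_CLIENT_CONNECT_TIMEOUT
field from the original error is defined on Hive's HiveConf class):

  # List every jar that bundles HiveConf; whichever comes first on the
  # classpath wins.
  for j in $HIVE_HOME/lib/*.jar $SPARK_HOME/lib/*.jar; do
    if unzip -l "$j" 2>/dev/null | grep -q 'org/apache/hadoop/hive/conf/HiveConf.class'; then
      echo "$j"
    fi
  done

  # If the spark-assembly jar shows up ahead of Hive's own jars, its older
  # bundled HiveConf shadows Hive 1.2.1's and the newer field is missing.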

--
Marcelo


---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
