Thanks, I tried all of those suggestions, unfortunately :(

 

I am trying to make Hive use Spark as its execution engine, and apparently Hive can use Spark version 1.3 for this. Frankly, I don't know why it is not working!
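For reference, this is roughly how I am switching the engine over, per session from the Hive CLI (a minimal sketch; the SPARK_HOME path, master URL and table name below are illustrative, not my actual values):

export SPARK_HOME=/usr/lib/spark     # must point at the Spark 1.3 installation
hive -e "
  set hive.execution.engine=spark;   -- switch Hive from MapReduce to Spark
  set spark.master=yarn-client;      -- or spark://host:7077 for standalone
  select count(*) from src;          -- any query that forces a Spark job
"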

 

Mich Talebzadeh

 

Sybase ASE 15 Gold Medal Award 2008

A Winning Strategy: Running the most Critical Financial Data on ASE 15

http://login.sybase.com/files/Product_Overviews/ASE-Winning-Strategy-091908.pdf

Author of the book "A Practitioner's Guide to Upgrading to Sybase ASE 15", ISBN 978-0-9563693-0-7, and co-author of "Sybase Transact SQL Guidelines Best Practices", ISBN 978-0-9759693-0-4.

Publications due shortly:

Complex Event Processing in Heterogeneous Environments, ISBN: 978-0-9563693-3-8

Oracle and Sybase, Concepts and Contrasts, ISBN: 978-0-9563693-1-4, volume one 
out shortly

 

http://talebzadehmich.wordpress.com

 

NOTE: The information in this email is proprietary and confidential. This message is for the designated recipient only; if you are not the intended recipient, you should destroy it immediately. Any information in this message shall not be understood as given or endorsed by Peridale Technology Ltd, its subsidiaries or their employees, unless expressly so stated. It is the responsibility of the recipient to ensure that this email is virus free; therefore neither Peridale Ltd, its subsidiaries nor their employees accept any responsibility.

 

From: Furcy Pin [mailto:furcy....@flaminem.com] 
Sent: 03 December 2015 18:07
To: u...@hive.apache.org
Cc: user@spark.apache.org
Subject: Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

 

Maybe you compiled against one version of Spark and are running against a different one?
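One hedged way to check (jar locations below are illustrative; adjust them to your layout):

$SPARK_HOME/bin/spark-submit --version     # the Spark that actually gets launched
ls $SPARK_HOME/lib/spark-assembly-*.jar    # the assembly handed to spark-submit
# An assembly built with the -Phive profile bundles its own copies of the
# Hive classes, which can shadow hive-exec and cause exactly this kind of
# NoSuchFieldError; the "built with Hive" line in your log is suggestive:
unzip -l $SPARK_HOME/lib/spark-assembly-*.jar | grep -c 'org/apache/hadoop/hive/conf'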

 

On Thu, Dec 3, 2015 at 6:54 PM, Mich Talebzadeh <m...@peridale.co.uk> wrote:

Trying to run Hive on the Spark 1.3 engine, I get:

 

--conf hive.spark.client.channel.log.level=null --conf hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256
15/12/03 17:53:18 [stderr-redir-1]: INFO client.SparkClientImpl: Spark assembly has been built with Hive, including Datanucleus jars on classpath
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.connect.timeout=1000
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.server.connect.timeout=90000
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: 15/12/03 17:53:19 INFO client.RemoteDriver: Connecting to: rhes564:36577
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:46)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:146)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:556)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at java.lang.reflect.Method.invoke(Method.java:606)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

 

Any clues?
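In case it helps: as I understand it, a NoSuchFieldError thrown during class initialisation means that RpcConfiguration was compiled against a HiveConf which declares SPARK_RPC_CLIENT_CONNECT_TIMEOUT, while the HiveConf actually loaded at runtime does not have it. A rough way to see whether two copies of HiveConf are competing on the classpath (jar paths below are illustrative):

# the HiveConf shipped with Hive should declare the field
unzip -p $HIVE_HOME/lib/hive-common-*.jar 'org/apache/hadoop/hive/conf/HiveConf$ConfVars.class' | strings | grep SPARK_RPC_CLIENT_CONNECT_TIMEOUT
# does the Spark assembly carry a second, possibly older, copy?
unzip -l $SPARK_HOME/lib/spark-assembly-*.jar | grep 'hive/conf/HiveConf'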

 

 
