Right now we only support Spark 2.0.0; the issue you are facing is probably
due to a version mismatch.

You may find this JIRA useful: SPARK-16292. You want to make sure you are
building the Spark distribution correctly.
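For what it's worth, the Hive on Spark wiki recommends a Spark build that does not
bundle the Hive jars, and a Spark version matching the spark.version your Hive release
was compiled against (2.0.0 for the Hive 2.3 line, per the point above). As a rough
sketch only (the exact Maven profiles depend on your Hadoop version, so double-check
the wiki for your release):

    ./dev/make-distribution.sh --name hadoop2-without-hive --tgz \
        -Pyarn,hadoop-provided,hadoop-2.7,parquet-provided

A NoSuchFieldError like the one in your trace is the typical symptom of mixing a Hive
built against one Spark version with the jars of another.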

On Mon, Nov 27, 2017 at 8:48 AM, <stephane.d...@orange.com> wrote:

> Hello all
>
>
>
> I’m trying to have Hive running on top of a Spark cluster.
>
> - Hive version: 2.3.2, installed with the embedded Derby database (local mode)
> - Spark version: 2.2.0, installed in cluster mode, no YARN, no Mesos
> - Hadoop version: 2.7.4
> - OS: Red Hat 7
>
>
>
> There is something special here: I don’t run it on top of Hadoop, but on
> top of Elasticsearch, thanks to the elasticsearch-hadoop (ES-Hadoop) bridge. The
> reason why I’m using Derby and basic cluster mode for Spark is that I’m currently
> in a kind of discovery phase.
>
> What is working nicely:
>
> - Spark on ES: I can submit Python scripts to query my Elastic DB
> - Hive on ES: it works with engine=mr; I’d like to have it with engine=spark
>   (see the snippet just below for what I mean)
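>
> To illustrate what I mean by engine=spark (the property names are the standard
> ones; the master URL and table name are just placeholders for my setup), I do
> something like this in the Hive session before running the query:
>
>     set hive.execution.engine=spark;
>     set spark.master=spark://my-spark-master:7077;
>     select count(*) from my_es_backed_table;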
>
>
>
> What I can see is that when I launch my Hive query, things look normal at
> first from the HiveServer point of view:
>
>
>
> 2017-11-27T16:43:08,808  INFO [stderr-redir-1] client.SparkClientImpl: {
> 2017-11-27T16:43:08,808  INFO [stderr-redir-1] client.SparkClientImpl:   "action" : "CreateSubmissionResponse",
> 2017-11-27T16:43:08,808  INFO [stderr-redir-1] client.SparkClientImpl:   "message" : "Driver successfully submitted as driver-20171127164308-0002",
> 2017-11-27T16:43:08,808  INFO [stderr-redir-1] client.SparkClientImpl:   "serverSparkVersion" : "2.2.0",
> 2017-11-27T16:43:08,808  INFO [stderr-redir-1] client.SparkClientImpl:   "submissionId" : "driver-20171127164308-0002",
> 2017-11-27T16:43:08,808  INFO [stderr-redir-1] client.SparkClientImpl:   "success" : true
> 2017-11-27T16:43:08,808  INFO [stderr-redir-1] client.SparkClientImpl: }
>
>
>
> But actually, on the Spark side, I get the following error:
>
>
>
> Exception in thread "main" java.lang.reflect.InvocationTargetException
>         at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>         at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at org.apache.spark.deploy.worker.DriverWrapper$.main(DriverWrapper.scala:58)
>         at org.apache.spark.deploy.worker.DriverWrapper.main(DriverWrapper.scala)
> Caused by: java.lang.NoSuchFieldError: SPARK_RPC_SERVER_ADDRESS
>
>
>
> I’ve set the hive.spark.client.rpc.server.address property on all the
> Spark nodes, where I’ve also installed the Hive binaries and pushed the
> hive-site.xml. I’ve also set HIVE_CONF_DIR and HIVE_HOME on all nodes,
> but it doesn’t work.
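>
> For reference, this is roughly how the property looks in the hive-site.xml I
> pushed to every node (the host below is just a placeholder for my HiveServer2
> machine):
>
>     <property>
>       <name>hive.spark.client.rpc.server.address</name>
>       <value>hiveserver2-host.example.com</value>
>     </property>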
>
>
>
> I’m a little bit lost now; I don’t see what else I could do.
>
>
>
> Your help is appreciated.
>
>
>
> Thanks a lot,
>
>
>
> Stéphane
>
>
>


-- 
Sahil Takiar
Software Engineer
takiar.sa...@gmail.com | (510) 673-0309
