Sorry, I ran it on YARN only, but I gather it should work with Mesos. I
don't think that comes into it.

The issue is the compatibility of the Spark assembly library with Hive.
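If it helps, the build-and-configure steps Mich describes further down the thread look roughly like this. This is a sketch, not a verified recipe: the profiles, versions, and settings shown are illustrative and should be adjusted to your Hadoop version and cluster.

```shell
# Build Spark 1.3.1 from source WITHOUT the Hive assembly, so that Hive
# (>= 1.2) supplies its own Hive classes when using Spark as its execution
# engine. The build profiles below are illustrative, not prescriptive.
git clone https://github.com/apache/spark.git
cd spark
git checkout v1.3.1
./make-distribution.sh --name "hadoop2-without-hive" --tgz \
  "-Pyarn,hadoop-provided,hadoop-2.4"

# Then point Hive at the resulting distribution (e.g. via SPARK_HOME) and
# switch engines, for example per session from the Hive CLI:
#   set hive.execution.engine=spark;
#   set spark.master=yarn-client;
```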

HTH

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


*Disclaimer:* Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.



On 15 September 2016 at 22:41, John Omernik <j...@omernik.com> wrote:

> Did you run it on Mesos? Your presentation doesn't mention Mesos at all...
>
> John
>
>
> On Thu, Sep 15, 2016 at 4:20 PM, Mich Talebzadeh <
> mich.talebza...@gmail.com> wrote:
>
>> Yes, you can. Hive on Spark, meaning Hive using Spark as its execution
>> engine, works fine. The combination I managed to make work is any Hive
>> version > 1.2 with Spark 1.3.1.
>>
>> You need to build Spark from the source code, excluding the Hive libraries.
>>
>> Check my attached presentation.
>>
>>  HTH
>>
>>
>>
>>
>> On 15 September 2016 at 22:10, John Omernik <j...@omernik.com> wrote:
>>
>>> Hey all, I was experimenting with some bleeding edge Hive.  (2.1) and
>>> trying to get it to run on bleeding edge Spark (2.0).
>>>
>>> Spark is working fine, I can query the data all is setup, however, I
>>> can't get Hive on Spark to work. I understand it's not really a thing (Hive
>>> on Spark on Mesos) but I am thinking... why not?  Thus I am posting here.
>>> (I.e. is there some reason why this shouldn't work other than it just
>>> hasn't been attempted?)
>>>
>>> The error I am getting is odd (see below); I'm not sure why it would pop
>>> up, as everything else seems right. Any help would be appreciated.
>>>
>>> John
>>>
>>>
>>>
>>>
>>> at java.lang.ClassLoader.defineClass1(Native Method)
>>> at java.lang.ClassLoader.defineClass(ClassLoader.java:760)
>>> at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
>>> at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
>>> at java.security.AccessController.doPrivileged(Native Method)
>>> at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>> at java.lang.Class.forName0(Native Method)
>>> at java.lang.Class.forName(Class.java:348)
>>> at org.apache.spark.util.Utils$.classForName(Utils.scala:225)
>>> at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:686)
>>> at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
>>> at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
>>> at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
>>> at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
>>> Caused by: java.lang.ClassNotFoundException: org.apache.spark.JavaSparkListener
>>> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
>>> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>> ... 20 more
>>>
>>> at org.apache.hive.spark.client.rpc.RpcServer.cancelClient(RpcServer.java:179)
>>> at org.apache.hive.spark.client.SparkClientImpl$3.run(SparkClientImpl.java:465)
>>>
>>
>>
>
