I'm hitting the same problem; it seems JavaSparkListener has been deleted in Spark 2.
But I've seen reports of people running Hive 1.2.1 with Spark 2 successfully. I haven't tried it yet.
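For what it's worth, a minimal sketch of why the removal bites and what the HIVE-14029 fix amounts to. These are hypothetical stand-in classes, not the real Spark API: in Spark 1.x, Java code needed the JavaSparkListener adapter; in Spark 2.x that adapter is gone, and SparkListener itself provides no-op defaults, so a Java subclass can extend it directly and override only the callbacks it cares about.

```java
// Hypothetical model of the Spark 2.x listener shape (NOT actual Spark classes):
// an abstract base with no-op defaults, so subclasses override selectively.
abstract class SparkListenerModel {           // stand-in for Spark 2's SparkListener
    void onJobStart(int jobId) { }            // no-op default
    void onJobEnd(int jobId)   { }            // no-op default
}

// Stand-in for what Hive's spark-client listener does after HIVE-14029:
// extend the base class directly instead of the removed Java adapter.
class HiveJobMonitor extends SparkListenerModel {
    int lastFinished = -1;
    @Override void onJobEnd(int jobId) { lastFinished = jobId; }
}

public class ListenerDemo {
    public static void main(String[] args) {
        HiveJobMonitor m = new HiveJobMonitor();
        m.onJobStart(7);                      // inherited no-op
        m.onJobEnd(7);                        // overridden callback fires
        System.out.println(m.lastFinished);   // prints 7
    }
}
```

The flip side is the error in this thread: Hive jars compiled against the old JavaSparkListener class still reference it by name, so the JVM throws NoClassDefFoundError the moment that class is missing from the Spark 2 classpath, regardless of how similar the replacement API is.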




------------------ Original Message ------------------
From: "Stephen Sprague"<sprag...@gmail.com>;
Date: Saturday, March 18, 2017, 2:33 PM
To: "user@hive.apache.org"<user@hive.apache.org>;
Subject: Re: hive on spark - version question



:(  gettin' no love on this one.   Any SMEs know if Spark 2.1.0 will work with
Hive 2.1.0?  That JavaSparkListener class looks like a deal breaker to me,
alas.


thanks in advance.


Cheers,

Stephen.



On Mon, Mar 13, 2017 at 10:32 PM, Stephen Sprague <sprag...@gmail.com> wrote:
hi guys,

wondering where we stand with Hive On Spark these days?


I'm trying to run Spark 2.1.0 with Hive 2.1.0 (purely coincidental versions)
and running up against this class not found:

java.lang.NoClassDefFoundError: org/apache/spark/JavaSparkListener



Searching the Cyber I find this:

    1. http://stackoverflow.com/questions/41953688/setting-spark-as-default-execution-engine-for-hive


    which pretty much describes my situation too and it references this:



    2. https://issues.apache.org/jira/browse/SPARK-17563



    which indicates a "won't fix" - but does reference this:



    3. https://issues.apache.org/jira/browse/HIVE-14029


    which looks to be fixed in Hive 2.2 - which is not released yet.



So if I want to use Spark 2.1.0 with Hive, am I out of luck until Hive 2.2?


thanks,

Stephen.
