hi guys,
wondering where we stand with Hive on Spark these days?

I'm trying to run Spark 2.1.0 with Hive 2.1.0 (the matching version
numbers are purely coincidental) and am running up against this
class-not-found error:

java.lang.NoClassDefFoundError: org/apache/spark/JavaSparkListener
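
For context, here's roughly how I'm wiring the two together (the path
and table name below are placeholders; hive.execution.engine is the
setting from the Hive on Spark wiki):

    # point Hive at the Spark 2.1.0 install, then run any query that
    # actually launches a Spark job; the Spark client dies immediately
    # with the NoClassDefFoundError above
    export SPARK_HOME=/opt/spark-2.1.0        # placeholder path
    hive -e "set hive.execution.engine=spark;
             select count(*) from some_table;"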


Searching around, I found these:
1. http://stackoverflow.com/questions/41953688/setting-spark-as-default-execution-engine-for-hive

   which pretty much describes my situation too, and which references:

2. https://issues.apache.org/jira/browse/SPARK-17563

   which was resolved as "Won't Fix", but does reference:

3. https://issues.apache.org/jira/browse/HIVE-14029

   which looks to be fixed in Hive 2.2, which is not released yet (a
   quick way to see the version skew is below).
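
For what it's worth, the skew is visible in Hive's own build: assuming
you have the 2.1.0 source release unpacked, something like this shows
the Spark version it was compiled against:

    # the root pom pins the Spark version Hive on Spark builds against
    grep -m1 '<spark.version>' apache-hive-2.1.0-src/pom.xml
    # on my copy this is a 1.6.x version, i.e. from before Spark 2.0
    # removed org.apache.spark.JavaSparkListener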


So if I want to use Spark 2.1.0 with Hive, am I out of luck until Hive 2.2?
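
(My fallback, unless someone knows better, is to keep a separate Spark
1.6.x build around purely for Hive's use, built without the Hive jars
as the wiki suggests, while the rest of the cluster stays on 2.1.0.
Roughly:)

    # hypothetical layout: 2.1.0 stays the default Spark, and a 1.6.x
    # "without-hive" build is used only as Hive's execution engine
    export SPARK_HOME=/opt/spark-1.6.3-bin-hadoop2-without-hive   # placeholder
    hive -e "set hive.execution.engine=spark; select count(*) from some_table;"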

thanks,
Stephen.
