Hive 2.0.1 only supports Spark 1.5.0, and Hive 2.3.2 only supports Spark 2.0.0.

There was probably a backwards-incompatible change in Spark between 1.x and
2.x that broke the integration.

https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark%3A+Getting+Started#HiveonSpark:GettingStarted-VersionCompatibility
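
For reference, the setup steps from that page look roughly like this
(paths, jar versions, and memory settings below are placeholders; the jar
linking step is the one documented for Hive 2.2.0+ running on Spark 2.x):

  # Make the Spark 2.x libraries visible to Hive by linking them into
  # HIVE_HOME/lib (exact jar names depend on your Spark build):
  ln -s $SPARK_HOME/jars/scala-library-*.jar $HIVE_HOME/lib/
  ln -s $SPARK_HOME/jars/spark-core_*.jar $HIVE_HOME/lib/
  ln -s $SPARK_HOME/jars/spark-network-common_*.jar $HIVE_HOME/lib/

  -- Then, in the Hive CLI or Beeline:
  set hive.execution.engine=spark;
  set spark.master=yarn;
  set spark.executor.memory=2g;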

If there is interest in running newer versions of Hive with older versions
of Spark, we can consider adding a Spark shims layer to Hive, similar to
what we do to support multiple Hadoop versions; a rough sketch follows below.
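
To make the idea concrete, here is a minimal sketch of what such a layer
could look like, following the same pattern as Hive's HadoopShims and
ShimLoader (all names below are hypothetical, not existing Hive APIs):

  // Hypothetical SparkShims interface: one implementation per supported
  // Spark line, each compiled against that version's API. A loader would
  // pick the right one at runtime based on the Spark version detected on
  // the classpath, the way ShimLoader does for Hadoop today.
  public interface SparkShims {
    /** Submit a Hive work unit as a Spark job, using version-specific APIs. */
    SparkJobRef submitJob(SparkJobSpec spec) throws Exception;

    /** Read job/stage progress in a version-specific way. */
    SparkJobStatus getStatus(SparkJobRef job);
  }

  // e.g. Spark1xShims implements SparkShims (built against Spark 1.x)
  //      Spark2xShims implements SparkShims (built against Spark 2.x)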

On Fri, May 4, 2018 at 1:24 AM, Mich Talebzadeh <[email protected]>
wrote:

> Hi,
>
> My Hive 2.0.1 works fine on the Spark 1.3.1 engine.
>
> select count(1) from sales;
> Starting Spark Job = dc529d0e-e2d2-431f-8c17-f7867858217f
> Query Hive on Spark job[5] stages:
> 10
> 11
> Status: Running (Hive on Spark job[5])
>
> However, once I upgraded Hive to version 2.3.2, it stopped working!
>
> Any ideas what has changed?
>
> Thanks
>
>
> Dr Mich Talebzadeh
>
>
>
> LinkedIn:
> https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
>
>
> http://talebzadehmich.wordpress.com
>
>
> Disclaimer: Use it at your own risk. Any and all responsibility for any
> loss, damage or destruction of data or any other property which may arise
> from relying on this email's technical content is explicitly disclaimed.
> The author will in no case be liable for any monetary damages arising from
> such loss, damage or destruction.
>
>
>



-- 
Sahil Takiar
Software Engineer
[email protected] | (510) 673-0309
