Spark 2.4 works with Hadoop 3 (as an optional build profile) and Hive 1.x. I doubt it
will work when connecting to a Hadoop 3 / Hive 3 cluster, though it may in a few cases.
It's also possible that some vendor distributions support this combination.
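
If you want to experiment, the relevant knobs are Spark's external-metastore
settings (spark.sql.hive.metastore.version and spark.sql.hive.metastore.jars).
A rough sketch in Scala; the jar path is a placeholder, and whether Spark 2.4
will accept anything newer than a 2.3.x metastore version is exactly what you
would have to verify:

import org.apache.spark.sql.SparkSession

// Point Spark SQL at an external Hive metastore instead of the built-in client.
// In Spark 2.4 the supported values for spark.sql.hive.metastore.version top out
// in the 2.3.x range; 2.3.3 matches the metastore you are running today.
val spark = SparkSession.builder()
  .appName("metastore-compat-check")
  .config("spark.sql.hive.metastore.version", "2.3.3")
  // Classpath of the Hive client jars to use for talking to the metastore
  // (placeholder path; point it at your Hive installation's lib directory).
  .config("spark.sql.hive.metastore.jars", "/path/to/hive/lib/*")
  .enableHiveSupport()
  .getOrCreate()

// Quick smoke test against the metastore.
spark.sql("SHOW DATABASES").show()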

On Mon, Jul 6, 2020 at 7:51 AM Teja <saiteja.pa...@gmail.com> wrote:
>
> We use Spark 2.4.0 to connect to a Hadoop 2.7 cluster and query a Hive
> metastore, version 2.3. But the cluster management team has decided to upgrade
> to Hadoop 3.x and Hive 3.x. We have not been able to migrate to Spark 3, which
> is compatible with Hadoop 3 and Hive 3, because we have not had a chance to
> test whether anything breaks.
>
> Is there any way to stay on Spark 2.4.x and still be able to use
> Hadoop 3 and Hive 3?
>
> I understand that backporting is one option, but I am not sure how to go
> about it. It would be great if you could point me in that direction.
>
>
>
