Greetings,

Support for Hadoop 2.6 has been removed in Spark 3.0, according to this ticket:
https://issues.apache.org/jira/browse/SPARK-25016

We run our Spark cluster on K8s in standalone mode.
We access HDFS/Hive running on a Hadoop 2.6 cluster.
We've been using Spark 2.4.5 and are planning to upgrade to Spark 3.0.0.
However, we don't have any control over the Hadoop cluster, and it will
remain on 2.6.

Is Spark 3.0 still compatible with HDFS/Hive running on Hadoop 2.6?

Best Regards,
