+1 (non-binding)

Beyond the typical signature/license/build checks, I ran tests locally
against each Spark+Scala combination and validated that these artifacts are
OK. I downloaded the Spark artifacts from https://spark.apache.org/downloads.html
(note you'll want the separate Spark artifacts prebuilt with Scala 2.13
when validating that combination). Spark 3.3 is no longer in the drop-down
on the downloads site, but I downloaded 3.3.4 built with Scala 2.12 from
<https://www.apache.org/dyn/closer.lua/spark/spark-3.3.4/spark-3.3.4-bin-hadoop3.tgz>
and built with Scala 2.13 from
<https://www.apache.org/dyn/closer.lua/spark/spark-3.3.4/spark-3.3.4-bin-hadoop3-scala2.13.tgz>.

I encourage folks to try out the staged Spark artifacts in their own test
environments or CI when voting, if possible; that would give us additional
confidence.

Thanks,

Amogh Jahagirdar
