Hi, I propose that Spark Docker images follow a tagging convention similar to Flink's <https://hub.docker.com/_/flink>, as shown in the attached file.
So for Spark the tags would take the form <PRODUCT_VERSION>-<SCALA_VERSION>-<JAVA_VERSION>, for example:

3.1.2-scala_2.12-java11
3.1.2_sparkpy-scala_2.12-java11
3.1.2_sparkR-scala_2.12-java11

(See the example pull commands below.)

If this makes sense, please respond; otherwise, state your preference.

HTH
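For illustration only, here is a minimal sketch of what pulling images under this scheme might look like. The repository name apache/spark is an assumption for the example; the actual Docker Hub repository is not settled here.

    # hypothetical repository name, tags follow the proposed convention
    docker pull apache/spark:3.1.2-scala_2.12-java11
    docker pull apache/spark:3.1.2_sparkpy-scala_2.12-java11
    docker pull apache/spark:3.1.2_sparkR-scala_2.12-java11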