Hi, Spark community,

I noticed that the Spark JDBC connector MySQL dialect is now tested against MySQL 
8.3.0 [1], which is a non-LTS version.

MySQL recently changed its version policy [2], and it is now very similar to the 
Java version policy. In short, 5.5, 5.6, 5.7, and 8.0 are LTS versions; 8.1, 8.2, 
and 8.3 are non-LTS (Innovation) releases; and the next LTS version will be 8.4.

I would say that MySQL is one of the most important pieces of infrastructure today. 
I checked the version support policies of Amazon RDS for MySQL [4] and Azure 
Database for MySQL [5], and both only support 5.7 and 8.0.

Similarly, Spark officially supports only LTS Java versions, such as JDK 17 and 21, 
but not 22. I would recommend testing against MySQL 8.0 until the next MySQL LTS 
version (8.4) is available.
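
To make the recommendation concrete, below is a minimal, hypothetical sketch in 
Scala (not Spark's actual test code) of how a suite could default its MySQL Docker 
image to the 8.0 LTS line while keeping it overridable, and sanity-check the server 
version over plain JDBC. The environment variable name, object name, and defaults 
are assumptions for illustration only.

    import java.sql.DriverManager

    object MySQLTestVersion {
      // Resolve the Docker image tag for the test container; default to the
      // 8.0 LTS line but allow an override. The env var name here is an
      // assumption for this sketch, not necessarily what Spark uses.
      val imageName: String =
        sys.env.getOrElse("MYSQL_DOCKER_IMAGE_NAME", "mysql:8.0")

      // Connect with plain JDBC and fail fast if the server is not on an
      // LTS line (5.7.x or 8.0.x).
      def assertLtsServer(url: String, user: String, password: String): Unit = {
        val conn = DriverManager.getConnection(url, user, password)
        try {
          val version = conn.getMetaData.getDatabaseProductVersion
          require(version.startsWith("5.7.") || version.startsWith("8.0."),
            s"Expected an LTS MySQL server (5.7.x or 8.0.x), got $version")
        } finally {
          conn.close()
        }
      }
    }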

Additional discussion can be found at [3].

[1] https://issues.apache.org/jira/browse/SPARK-47453
[2] https://dev.mysql.com/blog-archive/introducing-mysql-innovation-and-long-term-support-lts-versions/
[3] https://github.com/apache/spark/pull/45581
[4] https://aws.amazon.com/rds/mysql/
[5] https://learn.microsoft.com/en-us/azure/mysql/concepts-version-policy

Thanks,
Cheng Pan


