[ https://issues.apache.org/jira/browse/SPARK-52265?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Emilie Faracci updated SPARK-52265:
-----------------------------------

Description:

The {{HiveExternalCatalogVersionsSuite}} test "backward compatibility" does not run because of a flaw in its version-parsing logic. The test retrieves the Spark versions to test from [https://dist.apache.org/repos/dist/release/spark/], but the current regex pattern does not correctly handle recently added paths such as {{spark-connect-swift-0.1.0/}} and {{spark-kubernetes-operator-0.1.0/}}. As a result, {{PROCESS_TABLES.testingVersions}} ends up empty, the condition guarding the test is not met, and the suite logs the error {{Exception encountered when invoking run on a nested suite - Fail to get the latest Spark versions to test.}}

> Newly introduced artifacts in Spark release causes HiveExternalCatalogVersionsSuite not to run
> ----------------------------------------------------------------------------------------------
>
>                 Key: SPARK-52265
>                 URL: https://issues.apache.org/jira/browse/SPARK-52265
>             Project: Spark
>          Issue Type: Bug
>          Components: Tests
>    Affects Versions: 4.0.0, 4.1.0
>            Reporter: Emilie Faracci
>            Priority: Minor
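To illustrate the failure mode, here is a minimal Python sketch (the suite itself is Scala, and its exact pattern is not reproduced here). The directory listing fragment is a hypothetical stand-in modeled on the dist.apache.org page: a pattern that matches any dotted version string picks up bogus versions from the new non-distribution artifacts, while anchoring the version directly after {{spark-}} keeps only real Spark release directories.

```python
import re

# Hypothetical fragment of the HTML directory listing at
# https://dist.apache.org/repos/dist/release/spark/ (names from the report).
listing = """
<a href="spark-3.5.5/">spark-3.5.5/</a>
<a href="spark-4.0.0/">spark-4.0.0/</a>
<a href="spark-connect-swift-0.1.0/">spark-connect-swift-0.1.0/</a>
<a href="spark-kubernetes-operator-0.1.0/">spark-kubernetes-operator-0.1.0/</a>
"""

# A loose pattern that grabs any dotted version string also matches the
# new artifacts and yields a spurious "0.1.0".
loose = re.findall(r"(\d+\.\d+\.\d+)", listing)

# Anchoring the version immediately after 'spark-' and before the trailing
# slash skips spark-connect-swift-* and spark-kubernetes-operator-*.
strict = re.findall(r'href="spark-(\d+\.\d+\.\d+)/"', listing)

print(loose)   # contains "0.1.0" entries from the new artifact directories
print(strict)  # ['3.5.5', '4.0.0']
```

A spurious version like {{0.1.0}} has no corresponding Spark distribution to download, which is one way the list of usable test versions can end up empty.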