+1 from me, with a few comments.

I saw the following failures; are these known issues/flaky tests?

* PersistenceEngineSuite.ZooKeeperPersistenceEngine
Looks like a port conflict from a quick look at the logs (the suite conflicts
with something already bound to the admin port, 8080) - is this expected
behavior for the test?
I worked around it by shutting down the process that was using the port,
though I did not investigate deeply.

* org.apache.spark.sql.hive.HiveExternalCatalogVersionsSuite was aborted
It expects these artifacts in $HOME/.m2/repository:

1. tomcat#jasper-compiler;5.5.23!jasper-compiler.jar
2. tomcat#jasper-runtime;5.5.23!jasper-runtime.jar
3. commons-el#commons-el;1.0!commons-el.jar
4. org.apache.hive#hive-exec;2.3.7!hive-exec.jar

I worked around it by adding them locally explicitly - should we add them as
test dependencies?
I am not sure whether this changed in this release, though (I had cleaned my
local .m2 recently).
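For the record, this is roughly what I mean by adding them locally (a sketch only; the jar file names are placeholders for wherever the jars were fetched, and it just prints the `mvn install:install-file` invocations rather than running them):

```shell
# Print an install-file command for each of the four missing artifacts;
# the group:artifact:version triples come from the ivy-style names above.
cmds=$(for gav in tomcat:jasper-compiler:5.5.23 tomcat:jasper-runtime:5.5.23 \
                  commons-el:commons-el:1.0 org.apache.hive:hive-exec:2.3.7; do
  g=${gav%%:*}; rest=${gav#*:}; a=${rest%%:*}; v=${rest##*:}
  echo "mvn install:install-file -Dfile=$a-$v.jar" \
       "-DgroupId=$g -DartifactId=$a -Dversion=$v -Dpackaging=jar"
done)
echo "$cmds"
```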

Other than this, the rest looks good to me.

Regards,
Mridul


On Wed, Sep 28, 2022 at 2:56 PM Sean Owen <sro...@apache.org> wrote:

> +1 from me, same result as last RC.
>
> On Wed, Sep 28, 2022 at 12:21 AM Yuming Wang <wgy...@gmail.com> wrote:
>
>> Please vote on releasing the following candidate as Apache Spark version 
>> 3.3.1.
>>
>> The vote is open until 11:59pm Pacific time October 3rd and passes if a
>> majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.3.1
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see https://spark.apache.org
>>
>> The tag to be voted on is v3.3.1-rc2 (commit 
>> 1d3b8f7cb15283a1e37ecada6d751e17f30647ce):
>> https://github.com/apache/spark/tree/v3.3.1-rc2
>>
>> The release files, including signatures, digests, etc. can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.3.1-rc2-bin
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1421
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.3.1-rc2-docs
>>
>> The list of bug fixes going into 3.3.1 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12351710
>>
>> This release is using the release script of the tag v3.3.1-rc2.
>>
>>
>> FAQ
>>
>> =========================
>> How can I help test this release?
>> =========================
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark you can set up a virtual env, install
>> the current RC, and see if anything important breaks; in Java/Scala,
>> you can add the staging repository to your project's resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with an out-of-date RC going forward).
>>
>> ===========================================
>> What should happen to JIRA tickets still targeting 3.3.1?
>> ===========================================
>> The current list of open tickets targeted at 3.3.1 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target 
>> Version/s" = 3.3.1
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else please retarget to an
>> appropriate release.
>>
>> ==================
>> But my bug isn't fixed?
>> ==================
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is something which is a regression
>> that has not been correctly targeted please ping me or a committer to
>> help target the issue.
>>
