Hyukjin, you're right that I could have looked more closely. Sorry for that. I definitely should have been more careful.
rb

On Fri, May 22, 2020 at 5:19 PM Hyukjin Kwon <gurwls...@gmail.com> wrote:

> Ryan,
>
> > I'm fine with the commit, other than the fact that it violated ASF norms
> > <https://www.apache.org/foundation/voting.html> to commit without waiting
> > for a review.
>
> It looks like it became a different proposal from what you and other people
> discussed and suggested there, which you didn't technically vote for.
> It seems to have been reviewed properly by other committers, and I see you
> were pinged multiple times.
> It might be best to read it carefully before posting it on the RC vote
> thread.
>
> On Sat, May 23, 2020 at 6:55 AM, 王斐 <cn.feiw...@gmail.com> wrote:
>
>> Hi all,
>> Could we review this PR and resolve this issue before the Spark 3.0 RC3?
>> This is a fault-tolerance bug in Spark: not as serious as a correctness
>> issue, but pretty high up. (I am just citing the comment:
>> https://github.com/apache/spark/pull/26339#issuecomment-632707720.)
>> https://issues.apache.org/jira/browse/SPARK-29302
>> https://github.com/apache/spark/pull/26339
>>
>> Thanks a lot.
>>
>> On Tue, May 19, 2020 at 4:43 AM, Reynold Xin <r...@databricks.com> wrote:
>>
>>> Please vote on releasing the following candidate as Apache Spark version
>>> 3.0.0.
>>>
>>> The vote is open until Thu May 21 11:59pm Pacific time and passes if a
>>> majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>>
>>> [ ] +1 Release this package as Apache Spark 3.0.0
>>> [ ] -1 Do not release this package because ...
>>>
>>> To learn more about Apache Spark, please see http://spark.apache.org/
>>>
>>> The tag to be voted on is v3.0.0-rc2 (commit
>>> 29853eca69bceefd227cbe8421a09c116b7b753a):
>>> https://github.com/apache/spark/tree/v3.0.0-rc2
>>>
>>> The release files, including signatures, digests, etc., can be found at:
>>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc2-bin/
>>>
>>> Signatures used for Spark RCs can be found in this file:
>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>
>>> The staging repository for this release can be found at:
>>> https://repository.apache.org/content/repositories/orgapachespark-1345/
>>>
>>> The documentation corresponding to this release can be found at:
>>> https://dist.apache.org/repos/dist/dev/spark/v3.0.0-rc2-docs/
>>>
>>> The list of bug fixes going into 3.0.0 can be found at the following URL:
>>> https://issues.apache.org/jira/projects/SPARK/versions/12339177
>>>
>>> This release is using the release script of the tag v3.0.0-rc2.
>>>
>>> FAQ
>>>
>>> =========================
>>> How can I help test this release?
>>> =========================
>>>
>>> If you are a Spark user, you can help us test this release by taking
>>> an existing Spark workload, running it on this release candidate, and
>>> reporting any regressions.
>>>
>>> If you're working in PySpark, you can set up a virtual env, install
>>> the current RC, and see if anything important breaks. In Java/Scala,
>>> you can add the staging repository to your project's resolvers and test
>>> with the RC (make sure to clean up the artifact cache before/after so
>>> you don't end up building with an out-of-date RC going forward).
>>>
>>> ===========================================
>>> What should happen to JIRA tickets still targeting 3.0.0?
>>> ===========================================
>>>
>>> The current list of open tickets targeted at 3.0.0 can be found at:
>>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>>> Version/s" = 3.0.0
>>>
>>> Committers should look at those and triage. Extremely important bug
>>> fixes, documentation, and API tweaks that impact compatibility should
>>> be worked on immediately. Everything else, please retarget to an
>>> appropriate release.
>>>
>>> ==================
>>> But my bug isn't fixed?
>>> ==================
>>>
>>> In order to make timely releases, we will typically not hold the
>>> release unless the bug in question is a regression from the previous
>>> release. That being said, if there is a regression that has not been
>>> correctly targeted, please ping me or a committer to help target the
>>> issue.

-- 
Ryan Blue
Software Engineer
Netflix
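
For readers following the "add the staging repository to your project's resolvers" step in the vote email, here is a minimal build.sbt sketch. The resolver URL is taken from the email itself; the module name (`spark-sql`) and the staged version string (`3.0.0`) are assumptions for illustration and should be checked against the artifacts actually published in the staging repository.

```scala
// build.sbt — a minimal sketch for testing the 3.0.0 RC from the staging repo.
// The repository URL comes from the vote email above; the exact staged
// artifact version ("3.0.0") is an assumption — verify it in the repository.
resolvers += "Apache Spark 3.0.0 RC2 staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1345/"

// Pick whichever Spark modules your workload uses; spark-sql is an example.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.0.0"
```

As the email notes, clear your local artifact cache (e.g. `~/.ivy2/cache/org.apache.spark`) before and after testing so later builds don't silently pick up the RC artifacts.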