+1 Thank you, William
On 2023/02/13 07:32:49 John Zhuge wrote:
> +1 (non-binding)
>
> Rebased internal branch. Passed build with Java 8 and Scala 2.12. Passed
> integration tests with Python 3.10.
>
> On Sun, Feb 12, 2023 at 8:49 PM Yuming Wang <wgy...@gmail.com> wrote:
>
> > +1.
> >
> > On Mon, Feb 13, 2023 at 11:52 AM yangjie01 <yangji...@baidu.com> wrote:
> >
> >> +1. Tested 3.3.2-rc1 with Java 17 + Scala 2.13 + Python 3.10; all tests
> >> passed.
> >>
> >> Yang Jie
> >>
> >> From: Yikun Jiang <yikunk...@gmail.com>
> >> Date: Monday, February 13, 2023, 11:47
> >> To: Spark dev list <dev@spark.apache.org>
> >> Cc: "L. C. Hsieh" <vii...@gmail.com>
> >> Subject: Re: [VOTE] Release Spark 3.3.2 (RC1)
> >>
> >> +1. Tested 3.3.2-rc1 with spark-docker:
> >>
> >> - Downloading rc4 tgz, validate the key.
> >> - Extract bin and build image
> >> - Run K8s IT, standalone test of R/Python/Scala/All image [1]
> >>
> >> [1] https://github.com/apache/spark-docker/pull/29
> >>
> >> Regards,
> >> Yikun
> >>
> >> On Mon, Feb 13, 2023 at 10:25 AM yangjie01 <yangji...@baidu.com> wrote:
> >>
> >> Which Python version do you use for testing? When I use the latest Python
> >> 3.11, I can reproduce similar test failures (43 tests of the sql module
> >> fail), but when I use Python 3.10, they succeed.
> >>
> >> YangJie
> >>
> >> From: Bjørn Jørgensen <bjornjorgen...@gmail.com>
> >> Date: Monday, February 13, 2023, 05:09
> >> To: Sean Owen <sro...@gmail.com>
> >> Cc: "L. C. Hsieh" <vii...@gmail.com>, Spark dev list <dev@spark.apache.org>
> >> Subject: Re: [VOTE] Release Spark 3.3.2 (RC1)
> >>
> >> Tried it one more time and got the same result.
> >> On another box with Manjaro:
> >>
> >> ------------------------------------------------------------------------
> >> [INFO] Reactor Summary for Spark Project Parent POM 3.3.2:
> >> [INFO]
> >> [INFO] Spark Project Parent POM ........................... SUCCESS [01:50 min]
> >> [INFO] Spark Project Tags ................................. SUCCESS [ 17.359 s]
> >> [INFO] Spark Project Sketch ............................... SUCCESS [ 12.517 s]
> >> [INFO] Spark Project Local DB ............................. SUCCESS [ 14.463 s]
> >> [INFO] Spark Project Networking ........................... SUCCESS [01:07 min]
> >> [INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  9.013 s]
> >> [INFO] Spark Project Unsafe ............................... SUCCESS [  8.184 s]
> >> [INFO] Spark Project Launcher ............................. SUCCESS [ 10.454 s]
> >> [INFO] Spark Project Core ................................. SUCCESS [23:58 min]
> >> [INFO] Spark Project ML Local Library ..................... SUCCESS [ 21.218 s]
> >> [INFO] Spark Project GraphX ............................... SUCCESS [01:24 min]
> >> [INFO] Spark Project Streaming ............................ SUCCESS [04:57 min]
> >> [INFO] Spark Project Catalyst ............................. SUCCESS [08:00 min]
> >> [INFO] Spark Project SQL .................................. SUCCESS [01:02 h]
> >> [INFO] Spark Project ML Library ........................... SUCCESS [14:38 min]
> >> [INFO] Spark Project Tools ................................ SUCCESS [  4.394 s]
> >> [INFO] Spark Project Hive ................................. SUCCESS [53:43 min]
> >> [INFO] Spark Project REPL ................................. SUCCESS [01:16 min]
> >> [INFO] Spark Project Assembly ............................. SUCCESS [  2.186 s]
> >> [INFO] Kafka 0.10+ Token Provider for Streaming ........... SUCCESS [ 16.150 s]
> >> [INFO] Spark Integration for Kafka 0.10 ................... SUCCESS [01:34 min]
> >> [INFO] Kafka 0.10+ Source for Structured Streaming ........ SUCCESS [32:55 min]
> >> [INFO] Spark Project Examples ............................. SUCCESS [ 23.800 s]
> >> [INFO] Spark Integration for Kafka 0.10 Assembly .......... SUCCESS [  7.301 s]
> >> [INFO] Spark Avro ......................................... SUCCESS [01:19 min]
> >> [INFO] ------------------------------------------------------------------------
> >> [INFO] BUILD SUCCESS
> >> [INFO] ------------------------------------------------------------------------
> >> [INFO] Total time: 03:31 h
> >> [INFO] Finished at: 2023-02-12T21:54:20+01:00
> >> [INFO] ------------------------------------------------------------------------
> >>
> >> [bjorn@amd7g spark-3.3.2]$ java -version
> >> openjdk version "17.0.6" 2023-01-17
> >> OpenJDK Runtime Environment (build 17.0.6+10)
> >> OpenJDK 64-Bit Server VM (build 17.0.6+10, mixed mode)
> >>
> >> :)
> >>
> >> So I'm +1.
> >>
> >> On Sun, Feb 12, 2023 at 12:53, Bjørn Jørgensen <bjornjorgen...@gmail.com> wrote:
> >>
> >> I use Ubuntu rolling.
> >>
> >> $ java -version
> >> openjdk version "17.0.6" 2023-01-17
> >> OpenJDK Runtime Environment (build 17.0.6+10-Ubuntu-0ubuntu1)
> >> OpenJDK 64-Bit Server VM (build 17.0.6+10-Ubuntu-0ubuntu1, mixed mode, sharing)
> >>
> >> I have rebooted now and restarted ./build/mvn clean package.
> >>
> >> On Sun, Feb 12, 2023 at 04:47, Sean Owen <sro...@gmail.com> wrote:
> >>
> >> +1. The tests and all results were the same as ever for me (Java 11,
> >> Scala 2.13, Ubuntu 22.04).
> >>
> >> I also didn't see that issue... maybe somehow locale related? Which could
> >> still be a bug.
> >>
> >> On Sat, Feb 11, 2023 at 8:49 PM L. C. Hsieh <vii...@gmail.com> wrote:
> >>
> >> Thank you for testing it.
> >>
> >> I was going to run it again but still didn't see any errors.
> >>
> >> I also checked CI (and looked again now) on branch-3.3 before cutting the RC.
> >>
> >> BTW, I didn't find an actual test failure (i.e. "- test_name *** FAILED ***")
> >> in the log file.
> >>
> >> Maybe it is due to the dev env? What dev env are you using to run the tests?
> >>
> >> On Sat, Feb 11, 2023 at 8:58 AM Bjørn Jørgensen <bjornjorgen...@gmail.com> wrote:
> >> >
> >> > ./build/mvn clean package
> >> >
> >> > Run completed in 1 hour, 18 minutes, 29 seconds.
> >> > Total number of tests run: 11652
> >> > Suites: completed 516, aborted 0
> >> > Tests: succeeded 11609, failed 43, canceled 8, ignored 57, pending 0
> >> > *** 43 TESTS FAILED ***
> >> > [INFO] ------------------------------------------------------------------------
> >> > [INFO] Reactor Summary for Spark Project Parent POM 3.3.2:
> >> > [INFO]
> >> > [INFO] Spark Project Parent POM ........................... SUCCESS [  3.418 s]
> >> > [INFO] Spark Project Tags ................................. SUCCESS [ 17.845 s]
> >> > [INFO] Spark Project Sketch ............................... SUCCESS [ 20.791 s]
> >> > [INFO] Spark Project Local DB ............................. SUCCESS [ 16.527 s]
> >> > [INFO] Spark Project Networking ........................... SUCCESS [01:03 min]
> >> > [INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  9.914 s]
> >> > [INFO] Spark Project Unsafe ............................... SUCCESS [ 12.007 s]
> >> > [INFO] Spark Project Launcher ............................. SUCCESS [  7.620 s]
> >> > [INFO] Spark Project Core ................................. SUCCESS [40:04 min]
> >> > [INFO] Spark Project ML Local Library ..................... SUCCESS [ 29.997 s]
> >> > [INFO] Spark Project GraphX ............................... SUCCESS [02:33 min]
> >> > [INFO] Spark Project Streaming ............................ SUCCESS [05:51 min]
> >> > [INFO] Spark Project Catalyst ............................. SUCCESS [13:29 min]
> >> > [INFO] Spark Project SQL .................................. FAILURE [01:25 h]
> >> > [INFO] Spark Project ML Library ........................... SKIPPED
> >> > [INFO] Spark Project Tools ................................ SKIPPED
> >> > [INFO] Spark Project Hive ................................. SKIPPED
> >> > [INFO] Spark Project REPL ................................. SKIPPED
> >> > [INFO] Spark Project Assembly ............................. SKIPPED
> >> > [INFO] Kafka 0.10+ Token Provider for Streaming ........... SKIPPED
> >> > [INFO] Spark Integration for Kafka 0.10 ................... SKIPPED
> >> > [INFO] Kafka 0.10+ Source for Structured Streaming ........ SKIPPED
> >> > [INFO] Spark Project Examples ............................. SKIPPED
> >> > [INFO] Spark Integration for Kafka 0.10 Assembly .......... SKIPPED
> >> > [INFO] Spark Avro ......................................... SKIPPED
> >> > [INFO] ------------------------------------------------------------------------
> >> > [INFO] BUILD FAILURE
> >> > [INFO] ------------------------------------------------------------------------
> >> > [INFO] Total time: 02:30 h
> >> > [INFO] Finished at: 2023-02-11T17:32:45+01:00
> >> >
> >> > On Sat, Feb 11, 2023 at 06:01, L. C. Hsieh <vii...@gmail.com> wrote:
> >> >>
> >> >> Please vote on releasing the following candidate as Apache Spark
> >> >> version 3.3.2.
> >> >>
> >> >> The vote is open until Feb 15th 9AM (PST) and passes if a majority of
> >> >> +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >> >>
> >> >> [ ] +1 Release this package as Apache Spark 3.3.2
> >> >> [ ] -1 Do not release this package because ...
> >> >>
> >> >> To learn more about Apache Spark, please see https://spark.apache.org/
> >> >>
> >> >> The tag to be voted on is v3.3.2-rc1 (commit
> >> >> 5103e00c4ce5fcc4264ca9c4df12295d42557af6):
> >> >> https://github.com/apache/spark/tree/v3.3.2-rc1
> >> >>
> >> >> The release files, including signatures, digests, etc. can be found at:
> >> >> https://dist.apache.org/repos/dist/dev/spark/v3.3.2-rc1-bin/
> >> >>
> >> >> Signatures used for Spark RCs can be found in this file:
> >> >> https://dist.apache.org/repos/dist/dev/spark/KEYS
> >> >>
> >> >> The staging repository for this release can be found at:
> >> >> https://repository.apache.org/content/repositories/orgapachespark-1433/
> >> >>
> >> >> The documentation corresponding to this release can be found at:
> >> >> https://dist.apache.org/repos/dist/dev/spark/v3.3.2-rc1-docs/
> >> >>
> >> >> The list of bug fixes going into 3.3.2 can be found at the following URL:
> >> >> https://issues.apache.org/jira/projects/SPARK/versions/12352299
> >> >>
> >> >> This release is using the release script of the tag v3.3.2-rc1.
> >> >>
> >> >> FAQ
> >> >>
> >> >> =========================
> >> >> How can I help test this release?
> >> >> =========================
> >> >>
> >> >> If you are a Spark user, you can help us test this release by taking
> >> >> an existing Spark workload and running it on this release candidate,
> >> >> then reporting any regressions.
> >> >>
> >> >> If you're working in PySpark, you can set up a virtual env, install
> >> >> the current RC, and see if anything important breaks. In Java/Scala,
> >> >> you can add the staging repository to your project's resolvers and test
> >> >> with the RC (make sure to clean up the artifact cache before/after so
> >> >> you don't end up building with an out-of-date RC going forward).
> >> >>
> >> >> ===========================================
> >> >> What should happen to JIRA tickets still targeting 3.3.2?
> >> >> ===========================================
> >> >>
> >> >> The current list of open tickets targeted at 3.3.2 can be found by
> >> >> going to https://issues.apache.org/jira/projects/SPARK and searching
> >> >> for "Target Version/s" = 3.3.2
> >> >>
> >> >> Committers should look at those and triage. Extremely important bug
> >> >> fixes, documentation, and API tweaks that impact compatibility should
> >> >> be worked on immediately. Everything else, please retarget to an
> >> >> appropriate release.
> >> >>
> >> >> ==================
> >> >> But my bug isn't fixed?
> >> >> ==================
> >> >>
> >> >> In order to make timely releases, we will typically not hold the
> >> >> release unless the bug in question is a regression from the previous
> >> >> release. That being said, if there is something which is a regression
> >> >> that has not been correctly targeted, please ping me or a committer to
> >> >> help target the issue.
> >> >>
> >> >> ---------------------------------------------------------------------
> >> >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> >> >
> >> > --
> >> > Bjørn Jørgensen
> >> > Vestre Aspehaug 4, 6010 Ålesund
> >> > Norge
> >> > +47 480 94 297
>
> --
> John Zhuge