Re: [VOTE] Release Spark 3.2.1 (RC1)

2022-01-11 Thread Ruifeng Zheng
+1 (non-binding). Thanks, Ruifeng Zheng. -- Original -- From: "Cheng Su"

Re: [VOTE] Release Spark 3.2.1 (RC1)

2022-01-11 Thread Cheng Su
+1 (non-binding). Checked commit history and ran some local tests. Thanks, Cheng Su From: Qian Sun Date: Tuesday, January 11, 2022 at 7:55 PM To: huaxin gao Cc: dev Subject: Re: [VOTE] Release Spark 3.2.1 (RC1) +1 Looks good. All integration tests passed. Qian On January 11, 2022 at 2:09 AM, huaxin gao

Re: [VOTE] Release Spark 3.2.1 (RC1)

2022-01-11 Thread Qian Sun
+1 Looks good. All integration tests passed. Qian > On January 11, 2022 at 2:09 AM, huaxin gao wrote: > > Please vote on releasing the following candidate as Apache Spark version > 3.2.1. > > The vote is open until Jan. 13th at 12 PM PST (8 PM UTC) and passes if a > majority > +1 PMC votes are cast, with

Re: [VOTE][SPIP] Support Customized Kubernetes Schedulers Proposal

2022-01-11 Thread Thomas Graves
+1 (binding). One minor note, since I haven't had time to look at the implementation details: please make sure resource-aware scheduling and stage-level scheduling still work, or that any caveats are documented. Feel free to ping me if there are questions in these areas. Tom On Wed, Jan 5, 2022 at 7:07 PM
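
[Editor's note] For context on the feature Tom mentions, here is a minimal sketch of the stage-level scheduling API as exposed in Spark 3.1+. It assumes a GPU-capable cluster with dynamic allocation enabled; the app name, resource amounts, and the sample RDD are illustrative only.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.resource.{ExecutorResourceRequests, ResourceProfileBuilder, TaskResourceRequests}

    object StageLevelSchedulingSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("stage-level-scheduling-sketch").getOrCreate()
        val sc = spark.sparkContext

        // Executor-side requirements for this stage: 4 cores, 8 GiB heap, one GPU per executor.
        val execReqs = new ExecutorResourceRequests().cores(4).memory("8g").resource("gpu", 1)
        // Task-side requirements: 1 CPU and 1 GPU per task.
        val taskReqs = new TaskResourceRequests().cpus(1).resource("gpu", 1)

        val profile = new ResourceProfileBuilder().require(execReqs).require(taskReqs).build()

        // Attach the profile to one stage of an RDD job; other stages may use different profiles.
        val result = sc.parallelize(1 to 100, 4).withResources(profile).map(_ * 2).collect()
        println(result.take(5).mkString(", "))

        spark.stop()
      }
    }

The concern in the vote is that a customized Kubernetes scheduler must still honor profiles like the one above when it places executor pods.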

Re: [VOTE] Release Spark 3.2.1 (RC1)

2022-01-11 Thread Thomas Graves
+1, ran our internal tests and everything looks good. Tom On Mon, Jan 10, 2022 at 12:10 PM huaxin gao wrote: > > Please vote on releasing the following candidate as Apache Spark version > 3.2.1. > > The vote is open until Jan. 13th at 12 PM PST (8 PM UTC) and passes if a > majority > +1 PMC

Re: [VOTE] Release Spark 3.2.1 (RC1)

2022-01-11 Thread Sean Owen
+1 looks good to me. I ran all tests with Scala 2.12 and 2.13 and had the same results as 3.2.0 testing. On Mon, Jan 10, 2022 at 12:10 PM huaxin gao wrote: > Please vote on releasing the following candidate as Apache Spark version > 3.2.1. > > The vote is open until Jan. 13th at 12 PM PST (8 PM UTC) and passes if a > majority > +1 PMC

Re: Difference in behavior for Spark 3.0 vs Spark 3.1 "create database "

2022-01-11 Thread Wenchen Fan
Hopefully, this Stack Overflow answer can solve your problem: https://stackoverflow.com/questions/47523037/how-do-i-configure-pyspark-to-write-to-hdfs-by-default Spark doesn't control how paths are qualified; that is decided by certain Hadoop configs and/or config files. On Tue, Jan 11, 2022 at
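
[Editor's note] A minimal sketch of the kind of configuration the linked answer describes, shown in Scala for illustration (the question is about PySpark, but the config key is the same). fs.defaultFS is the Hadoop setting that determines how an unqualified path, such as a database warehouse location, is resolved; spark.hadoop.* properties are forwarded into the Hadoop Configuration. The namenode address below is hypothetical, and in practice this value usually comes from core-site.xml on the classpath.

    import org.apache.spark.sql.SparkSession

    // Hypothetical HDFS namenode address; replace with your cluster's, or rely on core-site.xml.
    val spark = SparkSession.builder()
      .appName("default-fs-sketch")
      .config("spark.hadoop.fs.defaultFS", "hdfs://namenode:8020")
      .getOrCreate()

    // With fs.defaultFS pointing at HDFS, an unqualified warehouse path is qualified against hdfs://...
    spark.sql("CREATE DATABASE IF NOT EXISTS demo_db")
    spark.sql("DESCRIBE DATABASE demo_db").show(truncate = false)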