Introducing English SDK for Apache Spark - Seeking Your Feedback and Contributions

2023-07-03 Thread Gengliang Wang
user-friendly. Thank you in advance for your attention and involvement. We look forward to hearing your thoughts and seeing your contributions! Best, Gengliang Wang

Re: Stickers and Swag

2022-06-14 Thread Gengliang Wang
FYI, now you can find the shopping information on https://spark.apache.org/community as well :) Gengliang > On Jun 14, 2022, at 7:47 PM, Hyukjin Kwon wrote: > > Woohoo > > On Tue, 14 Jun 2022 at 15:04, Xiao Li wrote: >

Re: [ANNOUNCE] Apache Spark 3.2.1 released

2022-01-29 Thread Gengliang Wang
Thanks to Huaxin for driving the release! Fengyu, this is a known issue that will be fixed in the 3.3 release. Currently, "hadoop3.2" means Hadoop 3.2 or higher. See the thread https://lists.apache.org/thread/yov8xsggo3g2qr2p1rrr2xtps25wkbvj for more details. On Sat, Jan 29, 2022 at 3:26 PM

Re: [ANNOUNCE] Apache Spark 3.2.0

2021-10-19 Thread Gengliang Wang
/spark/spark-3.2.0/spark-3.2.0-bin-hadoop3.3.tgz > > FYI, unable to download from this location. > Also, I don’t see Hadoop 3.3 version in the dist > > > On Oct 19, 2021, at 9:39 AM, Bode, Meikel, NMA-CFD < > meikel.b...@bertelsmann.de> wrote: > > Man

[ANNOUNCE] Apache Spark 3.2.0

2021-10-19 Thread Gengliang Wang
Hi all, Apache Spark 3.2.0 is the third release of the 3.x line. With tremendous contributions from the open-source community, this release resolved more than 1,700 Jira tickets. We'd like to thank our contributors and users for their contributions and early feedback on this release.

Re: spark 3.2 release date

2021-08-30 Thread Gengliang Wang
https://issues.apache.org/jira/browse/SPARK-36619 is resolved. Gengliang Wang > On Aug 31, 2021, at 12:06 PM, infa elance wrote: > > What is the expected ballpark release date of spark 3.2 ? > > Thanks and Regards, > Ajay.

Re: [ANNOUNCE] Announcing Apache Spark 3.0.1

2020-09-11 Thread Gengliang Wang
Congrats! Thanks for the work, Ruifeng! On Fri, Sep 11, 2020 at 9:51 PM Takeshi Yamamuro wrote: > Congrats and thanks, Ruifeng! > > > On Fri, Sep 11, 2020 at 9:50 PM Dongjoon Hyun > wrote: > >> It's great. Thank you, Ruifeng! >> >> Bests, >> Dongjoon. >> >> On Fri, Sep 11, 2020 at 1:54 AM 郑瑞峰

Re: Inner join with the table itself

2018-01-15 Thread Gengliang Wang
Hi Michael, You can use `EXPLAIN` to see how your query is optimized: https://docs.databricks.com/spark/latest/spark-sql/language-manual/explain.html I believe your query is an actual cross join, which is usually
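
As a hedged illustration of the advice above, here is a minimal Scala sketch (made-up table and column names) that inspects a self-join plan with EXPLAIN, both through the SQL command and through the Dataset API. Aliasing the two sides keeps the join condition from referring to the same column twice, which is what can turn an intended equi-join into a cross join.

    import org.apache.spark.sql.SparkSession

    object SelfJoinExplain {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("self-join-explain")
          .master("local[*]")
          .getOrCreate()
        import spark.implicits._

        val df = Seq((1, "a"), (2, "b")).toDF("id", "value")
        df.createOrReplaceTempView("t")

        // EXPLAIN via SQL: prints the parsed, analyzed, optimized, and physical plans.
        spark.sql("EXPLAIN EXTENDED SELECT * FROM t a JOIN t b ON a.id = b.id")
          .show(truncate = false)

        // The same inspection through the Dataset API: alias the two sides so the
        // join condition refers to distinct attributes instead of the same column twice.
        val joined = df.as("l").join(df.as("r"), $"l.id" === $"r.id")
        joined.explain(true) // extended plan, shows the join strategy actually chosen

        spark.stop()
      }
    }

In the EXPLAIN output, a CartesianProduct or BroadcastNestedLoopJoin node indicates a real cross join, while an equi-join normally shows up as SortMergeJoin or BroadcastHashJoin.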

Spark-avro 4.0.0 is released

2017-11-10 Thread Gengliang Wang
The 4.0.0 release adds support for Spark 2.2. The published artifact is compatible with both Spark 2.1 and 2.2. New Features: - Support for Spark 2.2 (#242): resolves a compatibility issue with the datasource write API changes
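
For context, a minimal sketch of using the released artifact through the datasource API is below. The package coordinates match the 4.0.0 release (com.databricks:spark-avro_2.11:4.0.0); the input and output paths are placeholders.

    // Launch with the published artifact, e.g.:
    //   spark-shell --packages com.databricks:spark-avro_2.11:4.0.0
    import org.apache.spark.sql.SparkSession

    object AvroRoundTrip {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("avro-roundtrip").getOrCreate()

        // Read an existing Avro dataset into a DataFrame.
        val df = spark.read
          .format("com.databricks.spark.avro")
          .load("/path/to/input")   // placeholder path

        // Write it back out as Avro via the datasource write API.
        df.write
          .format("com.databricks.spark.avro")
          .save("/path/to/output")  // placeholder path

        spark.stop()
      }
    }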