Re: ASF board report draft for November

2022-11-10 Thread Matei Zaharia
Sounds good.

> On Nov 7, 2022, at 12:02 PM, Dongjoon Hyun wrote:
>
> Shall we mention Spark 3.2.3 release preparation since Chao is currently
> actively working on it?
>
> Dongjoon.
>
> On Mon, Nov 7, 2022 at 11:53 AM Matei Zaharia wrote:
> It’s time to

Re: ASF board report draft for November

2022-11-07 Thread Dongjoon Hyun
Shall we mention Spark 3.2.3 release preparation since Chao is currently actively working on it?

Dongjoon.

On Mon, Nov 7, 2022 at 11:53 AM Matei Zaharia wrote:
> It’s time to send our quarterly report to the ASF board on Wednesday. Here
> is a draft, let me know if you have suggestions:
>

ASF board report draft for November

2022-11-07 Thread Matei Zaharia
It’s time to send our quarterly report to the ASF board on Wednesday. Here is a draft, let me know if you have suggestions:

===

Description:

Apache Spark is a fast and general purpose engine for large-scale data processing. It offers high-level APIs in Java, Scala, Python,
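A minimal PySpark sketch of the kind of high-level API the description refers to; the application name and the input path "data.txt" below are placeholders, not taken from the draft or the thread.

from pyspark.sql import SparkSession
from pyspark.sql.functions import col, explode, split

# Illustrative only: count words in a text file with the DataFrame API.
spark = SparkSession.builder.appName("word-count-sketch").getOrCreate()

lines = spark.read.text("data.txt")  # "data.txt" is a placeholder path
words = lines.select(explode(split(col("value"), r"\s+")).alias("word"))
counts = words.groupBy("word").count().orderBy(col("count").desc())
counts.show()

spark.stop()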

Re: ASF board report draft for November

2021-11-10 Thread Matei Zaharia
Sounds good, I’ll fix that.

Matei

> On Nov 9, 2021, at 12:39 AM, Mich Talebzadeh wrote:
>
> Hi,
>
> Just a minor modification
>
> Under Description:
>
> Apache Spark is a fast and general engine for large-scale data processing.
>
> It should read
>
> Apache Spark is a fast and

Re: ASF board report draft for November

2021-11-09 Thread Mich Talebzadeh
Hi,

Just a minor modification

Under Description:

Apache Spark is a fast and general engine for large-scale data processing.

It should read

Apache Spark is a fast and general purpose engine for large-scale data processing.

HTH

ASF board report draft for November

2021-11-09 Thread Matei Zaharia
Hi all,

Our ASF board report needs to be submitted again this Wednesday (November 10). I wrote a draft with the major things that happened in the past three months — let me know if I missed something.

===

Description:

Apache Spark is a fast and general engine for large-scale data