+1 for this. The rate of API breakage across minor Spark versions is a bit untenable anyway.
On Wed, Jan 12, 2022 at 1:22 AM Raymond Xu <xu.shiyan.raym...@gmail.com> wrote:

> Hi Chen,
>
> Yes, this is actually being worked on by liujinhui:
> https://issues.apache.org/jira/browse/HUDI-2370
>
> and it's planned for 0.11.
>
> --
> Best,
> Raymond
>
>
> On Sun, Jan 9, 2022 at 4:22 PM 陈 翔 <cdmikec...@hotmail.com> wrote:
>
> > There is another thing I care about: in Spark 3.2.0, Parquet 1.12
> > introduces an encryption feature:
> >
> > https://spark.apache.org/docs/latest/sql-data-sources-parquet.html#columnar-encryption
> >
> > Do we have plans to introduce this feature into Hudi? I think this
> > feature would be a great help to the security of the data lake.
> >
> >
> > From: Raymond Xu <xu.shiyan.raym...@gmail.com>
> > Date: Monday, January 10, 2022, 5:48 AM
> > To: dev@hudi.apache.org <dev@hudi.apache.org>
> > Subject: [DISCUSS] Dropping Spark 3.0.x support in 0.11
> >
> > Hi all,
> >
> > The incompatible changes across the Spark 3 versions, namely Spark
> > 3.0.x, 3.1.x, and the latest 3.2, are making Spark support
> > difficult. I'm proposing to drop Spark 3.0.x support from the next major
> > release, 0.11.0. Spark 3.0.x has been supported since 0.7.0 through 0.10.0.
> > The Spark 3 support matrix will look like:
> >
> > - 0.11.0
> >   - 3.2.0 (default build), 3.1.x
> > - 0.10.x
> >   - 3.1.x (default build), 3.0.x
> > - 0.7.0 - 0.9.0
> >   - 3.0.x
> > - 0.6.0 and prior
> >   - not supported
> >
> > Thanks.
> >
> > Best,
> > Raymond
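For context on the columnar encryption feature mentioned in the quoted thread: enabling Parquet modular encryption in Spark 3.2 looks roughly like the sketch below, based on the Spark docs linked above. The key material, key names, column name, and application file name are all illustrative, and the `InMemoryKMS` class is Parquet's mock KMS intended only for experimentation, not production.

```shell
# Plug Parquet's properties-driven crypto factory and a KMS client into Spark
# (illustrative sketch; keys and names below are placeholders, not real secrets).
spark-submit \
  --conf spark.hadoop.parquet.crypto.factory.class=org.apache.parquet.crypto.keytools.PropertiesDrivenCryptoFactory \
  --conf spark.hadoop.parquet.encryption.kms.client.class=org.apache.parquet.crypto.keytools.mocks.InMemoryKMS \
  --conf "spark.hadoop.parquet.encryption.key.list=keyA:AAECAwQFBgcICQoLDA0ODw== , keyB:AAECAAECAAECAAECAAECAA==" \
  my_app.py

# Then, per write, the application chooses which columns are encrypted with
# which key, e.g.:
#   df.write \
#     .option("parquet.encryption.column.keys", "keyA:credit_card") \
#     .option("parquet.encryption.footer.key", "keyB") \
#     .parquet("/path/to/table")
```

In a real deployment the mock KMS would be replaced with a `KmsClient` implementation backed by an actual key management service, which is presumably part of what an integration into Hudi would need to address.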