+1 Thanks, Dongjoon!

Xiao



On Mon, May 17, 2021 at 8:45 PM Kent Yao <yaooq...@gmail.com> wrote:

> +1. thanks Dongjoon
>
> Kent Yao
> @ Data Science Center, Hangzhou Research Institute, NetEase Corp.
> a spark enthusiast
> kyuubi <https://github.com/yaooqinn/kyuubi> is a unified multi-tenant JDBC
> interface for large-scale data processing and analytics, built on top
> of Apache Spark <http://spark.apache.org/>.
> spark-authorizer <https://github.com/yaooqinn/spark-authorizer> A Spark
> SQL extension which provides SQL Standard Authorization for Apache
> Spark <http://spark.apache.org/>.
> spark-postgres <https://github.com/yaooqinn/spark-postgres> A library for
> reading data from and transferring data to Postgres / Greenplum with Spark
> SQL and DataFrames, 10~100x faster.
> itatchi <https://github.com/yaooqinn/spark-func-extras> A library that
> brings useful functions from various modern database management systems to
> Apache Spark <http://spark.apache.org/>.
>
>
>
> On 05/18/2021 10:57, John Zhuge <jzh...@apache.org> wrote:
>
> +1, thanks Dongjoon!
>
> On Mon, May 17, 2021 at 7:50 PM Yuming Wang <wgy...@gmail.com> wrote:
>
>> +1.
>>
>> On Tue, May 18, 2021 at 9:06 AM Hyukjin Kwon <gurwls...@gmail.com> wrote:
>>
>>> +1, thanks for driving this
>>>
>>> On Tue, 18 May 2021, 09:33 Holden Karau, <hol...@pigscanfly.ca> wrote:
>>>
>>>> +1 and thanks for volunteering to be the RM :)
>>>>
>>>> On Mon, May 17, 2021 at 4:09 PM Takeshi Yamamuro <linguin....@gmail.com>
>>>> wrote:
>>>>
>>>>> Thank you, Dongjoon~ sgtm, too.
>>>>>
>>>>> On Tue, May 18, 2021 at 7:34 AM Cheng Su <chen...@fb.com.invalid>
>>>>> wrote:
>>>>>
>>>>>> +1 for a new release, thanks Dongjoon!
>>>>>>
>>>>>> Cheng Su
>>>>>>
>>>>>> On 5/17/21, 2:44 PM, "Liang-Chi Hsieh" <vii...@gmail.com> wrote:
>>>>>>
>>>>>>     +1 sounds good. Thanks Dongjoon for volunteering on this!
>>>>>>
>>>>>>
>>>>>>     Liang-Chi
>>>>>>
>>>>>>
>>>>>>     Dongjoon Hyun wrote:
>>>>>>     > Hi, All.
>>>>>>     >
>>>>>>     > Since the Apache Spark 3.1.1 tag was created (Feb 21),
>>>>>>     > 172 new patches, including 9 correctness patches and 4 K8s
>>>>>>     > patches, have arrived at branch-3.1.
>>>>>>     >
>>>>>>     > Shall we make a new release, Apache Spark 3.1.2, as the second
>>>>>>     > release in the 3.1 line?
>>>>>>     > I'd like to volunteer as the release manager for Apache Spark
>>>>>>     > 3.1.2.
>>>>>>     > I'm thinking about starting the first RC next week.
>>>>>>     >
>>>>>>     > $ git log --oneline v3.1.1..HEAD | wc -l
>>>>>>     >      172
>>>>>>     >
>>>>>>     > # Known correctness issues
>>>>>>     > SPARK-34534     New protocol FetchShuffleBlocks in
>>>>>>     > OneForOneBlockFetcher leads to data loss or correctness issues
>>>>>>     > SPARK-34545     PySpark Python UDF returns inconsistent results
>>>>>>     > when applying 2 UDFs with different return types to 2 columns
>>>>>>     > together
>>>>>>     > SPARK-34681     Full outer shuffled hash join when building the
>>>>>>     > left side produces wrong results
>>>>>>     > SPARK-34719     Fail if the view query has duplicated column
>>>>>>     > names
>>>>>>     > SPARK-34794     Nested higher-order functions broken in DSL
>>>>>>     > SPARK-34829     transform_values returns identical values when
>>>>>>     > it is used with a UDF that returns a reference type
>>>>>>     > SPARK-34833     Apply right-padding correctly for correlated
>>>>>>     > subqueries
>>>>>>     > SPARK-35381     Fix lambda variable name issues in nested
>>>>>>     > DataFrame functions in R APIs
>>>>>>     > SPARK-35382     Fix lambda variable name issues in nested
>>>>>>     > DataFrame functions in Python APIs
>>>>>>     >
>>>>>>     > # Notable K8s patches since K8s GA
>>>>>>     > SPARK-34674    Close SparkContext after the Main method has
>>>>>>     > finished
>>>>>>     > SPARK-34948    Add ownerReference to executor configmap to fix
>>>>>>     > leakages
>>>>>>     > SPARK-34820    Add apt-update before gnupg install
>>>>>>     > SPARK-34361    In case of downscaling, avoid killing executors
>>>>>>     > already known by the scheduler backend in the pod allocator
>>>>>>     >
>>>>>>     > Bests,
>>>>>>     > Dongjoon.
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>     --
>>>>>>     Sent from: http://apache-spark-developers-list.1001551.n3.nabble.com/
>>>>>>
>>>>>>
>>>>>> ---------------------------------------------------------------------
>>>>>>     To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>> --
>>>>> ---
>>>>> Takeshi Yamamuro
>>>>>
>>>> --
>>>> Twitter: https://twitter.com/holdenkarau
>>>> Books (Learning Spark, High Performance Spark, etc.):
>>>> https://amzn.to/2MaRAG9
>>>> YouTube Live Streams: https://www.youtube.com/user/holdenkarau
>>>>
>>>
>
> --
> John Zhuge
>
>
