+1 (non-binding)

On Wed, Apr 14, 2021 at 6:39 PM Dongjoon Hyun <dongjoon.h...@gmail.com>
wrote:

> +1
>
> Bests,
> Dongjoon.
>
> On Tue, Apr 13, 2021 at 10:38 PM Kent Yao <yaooq...@gmail.com> wrote:
>
>> +1 (non-binding)
>>
>> Kent Yao
>> @ Data Science Center, Hangzhou Research Institute, NetEase Corp.
>> a spark enthusiast
>> kyuubi <https://github.com/yaooqinn/kyuubi> is a unified multi-tenant JDBC
>> interface for large-scale data processing and analytics, built on top of
>> Apache Spark <http://spark.apache.org/>.
>> spark-authorizer <https://github.com/yaooqinn/spark-authorizer> A Spark
>> SQL extension which provides SQL Standard Authorization for Apache
>> Spark <http://spark.apache.org/>.
>> spark-postgres <https://github.com/yaooqinn/spark-postgres> A library
>> for reading data from and transferring data to Postgres / Greenplum with
>> Spark SQL and DataFrames, 10~100x faster.
>> spark-func-extras <https://github.com/yaooqinn/spark-func-extras> A
>> library that brings excellent and useful functions from various modern
>> database management systems to Apache Spark <http://spark.apache.org/>.
>>
>>
>>
>> On 04/14/2021 13:36, Gengliang Wang <ltn...@gmail.com> wrote:
>>
>> +1 (non-binding)
>>
>> On Wed, Apr 14, 2021 at 1:34 PM Jungtaek Lim <
>> kabhwan.opensou...@gmail.com> wrote:
>>
>>> +1 (non-binding)
>>>
>>> signature OK, extracting tgz files OK, building the source without
>>> running tests OK.
>>>
>>> On Tue, Apr 13, 2021 at 5:02 PM Herman van Hovell <her...@databricks.com>
>>> wrote:
>>>
>>>> +1
>>>>
>>>> On Tue, Apr 13, 2021 at 2:40 AM sarutak <saru...@oss.nttdata.com>
>>>> wrote:
>>>>
>>>>> +1 (non-binding)
>>>>>
>>>>> > +1
>>>>> >
>>>>> > On Tue, 13 Apr 2021, 02:58 Sean Owen, <sro...@gmail.com> wrote:
>>>>> >
>>>>> >> +1 same result as last RC for me.
>>>>> >>
>>>>> >> On Mon, Apr 12, 2021, 12:53 AM Liang-Chi Hsieh <vii...@gmail.com>
>>>>> >> wrote:
>>>>> >>
>>>>> >>> Please vote on releasing the following candidate as Apache Spark
>>>>> >>> version
>>>>> >>> 2.4.8.
>>>>> >>>
>>>>> >>> The vote is open until Apr 15th at 9AM PST and passes if a
>>>>> >>> majority +1 PMC
>>>>> >>> votes are cast, with a minimum of 3 +1 votes.
>>>>> >>>
>>>>> >>> [ ] +1 Release this package as Apache Spark 2.4.8
>>>>> >>> [ ] -1 Do not release this package because ...
>>>>> >>>
>>>>> >>> To learn more about Apache Spark, please see
>>>>> >>> http://spark.apache.org/
>>>>> >>>
>>>>> >>> There are currently no issues targeting 2.4.8 (try project = SPARK
>>>>> >>> AND
>>>>> >>> "Target Version/s" = "2.4.8" AND status in (Open, Reopened, "In
>>>>> >>> Progress"))
>>>>> >>>
>>>>> >>> The tag to be voted on is v2.4.8-rc2 (commit
>>>>> >>> a0ab27ca6b46b8e5a7ae8bb91e30546082fc551c):
>>>>> >>> https://github.com/apache/spark/tree/v2.4.8-rc2
>>>>> >>>
>>>>> >>> The release files, including signatures, digests, etc. can be
>>>>> >>> found at:
>>>>> >>> https://dist.apache.org/repos/dist/dev/spark/v2.4.8-rc2-bin/
>>>>> >>>
>>>>> >>> Signatures used for Spark RCs can be found in this file:
>>>>> >>> https://dist.apache.org/repos/dist/dev/spark/KEYS
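
[For vote participants, the signature check mentioned above typically looks like the following sketch. The binary artifact name here is an assumption for illustration; substitute whatever actually appears in the -bin/ directory.]

```shell
# Import the Spark release managers' public keys from the KEYS file
curl -fsSL https://dist.apache.org/repos/dist/dev/spark/KEYS | gpg --import

# Fetch an artifact and its detached .asc signature, then verify
# (spark-2.4.8-bin-hadoop2.7.tgz is an assumed file name)
curl -fsSLO https://dist.apache.org/repos/dist/dev/spark/v2.4.8-rc2-bin/spark-2.4.8-bin-hadoop2.7.tgz
curl -fsSLO https://dist.apache.org/repos/dist/dev/spark/v2.4.8-rc2-bin/spark-2.4.8-bin-hadoop2.7.tgz.asc
gpg --verify spark-2.4.8-bin-hadoop2.7.tgz.asc spark-2.4.8-bin-hadoop2.7.tgz
```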
>>>>> >>>
>>>>> >>> The staging repository for this release can be found at:
>>>>> >>> https://repository.apache.org/content/repositories/orgapachespark-1373/
>>>>> >>>
>>>>> >>> The documentation corresponding to this release can be found at:
>>>>> >>> https://dist.apache.org/repos/dist/dev/spark/v2.4.8-rc2-docs/
>>>>> >>>
>>>>> >>> The list of bug fixes going into 2.4.8 can be found at the
>>>>> >>> following URL:
>>>>> >>> https://s.apache.org/spark-v2.4.8-rc2
>>>>> >>>
>>>>> >>> This release is using the release script of the tag v2.4.8-rc2.
>>>>> >>>
>>>>> >>> FAQ
>>>>> >>>
>>>>> >>> =========================
>>>>> >>> How can I help test this release?
>>>>> >>> =========================
>>>>> >>>
>>>>> >>> If you are a Spark user, you can help us test this release by
>>>>> >>> taking
>>>>> >>> an existing Spark workload and running on this release candidate,
>>>>> >>> then
>>>>> >>> reporting any regressions.
>>>>> >>>
>>>>> >>> If you're working in PySpark, you can set up a virtual env, install
>>>>> >>> the current RC, and see if anything important breaks; in
>>>>> >>> Java/Scala, you can add the staging repository to your project's
>>>>> >>> resolvers and test with the RC (make sure to clean up the artifact
>>>>> >>> cache before/after so you don't end up building with an out-of-date
>>>>> >>> RC going forward).
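
[A minimal sketch of the PySpark check described above. The pyspark tarball name under the -bin/ directory is an assumption; adjust it to the actual artifact.]

```shell
# Create an isolated environment so the RC doesn't touch a system install
python -m venv spark-rc-test
. spark-rc-test/bin/activate

# Install the candidate PySpark build from the RC staging area
# (pyspark-2.4.8.tar.gz is an assumed artifact name)
pip install https://dist.apache.org/repos/dist/dev/spark/v2.4.8-rc2-bin/pyspark-2.4.8.tar.gz

# Trivial smoke test against the RC
python -c "from pyspark.sql import SparkSession; \
  spark = SparkSession.builder.master('local[1]').getOrCreate(); \
  print(spark.range(10).count()); spark.stop()"

# Afterwards, clear cached RC artifacts so later builds don't pick them up
rm -rf ~/.ivy2/cache/org.apache.spark ~/.m2/repository/org/apache/spark
```

[For the Java/Scala path, the analogous step is adding the staging repository URL above as an extra resolver in your build tool before resolving the 2.4.8 artifacts.]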
>>>>> >>>
>>>>> >>> ===========================================
>>>>> >>> What should happen to JIRA tickets still targeting 2.4.8?
>>>>> >>> ===========================================
>>>>> >>>
>>>>> >>> The current list of open tickets targeted at 2.4.8 can be found
>>>>> >>> at:
>>>>> >>> https://issues.apache.org/jira/projects/SPARK and search for
>>>>> >>> "Target
>>>>> >>> Version/s" = 2.4.8
>>>>> >>>
>>>>> >>> Committers should look at those and triage. Extremely important
>>>>> >>> bug
>>>>> >>> fixes, documentation, and API tweaks that impact compatibility
>>>>> >>> should
>>>>> >>> be worked on immediately. Everything else please retarget to an
>>>>> >>> appropriate release.
>>>>> >>>
>>>>> >>> ==================
>>>>> >>> But my bug isn't fixed?
>>>>> >>> ==================
>>>>> >>>
>>>>> >>> In order to make timely releases, we will typically not hold the
>>>>> >>> release unless the bug in question is a regression from the
>>>>> >>> previous
>>>>> >>> release. That being said, if there is something which is a
>>>>> >>> regression
>>>>> >>> that has not been correctly targeted please ping me or a committer
>>>>> >>> to
>>>>> >>> help target the issue.
>>>>> >>>
>>>>> >>> --
>>>>> >>> Sent from:
>>>>> >>> http://apache-spark-developers-list.1001551.n3.nabble.com/
>>>>> >>>
>>>>> >>>
>>>>> >>
>>>>>
>>>>> ---------------------------------------------------------------------
>>>>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>>>>
>>>>>
