+1

On Mon, Jun 21, 2021 at 2:19 PM Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:

> +1
>
> Thank you, Yi.
>
> Bests,
> Dongjoon.
>
>
> On Sat, Jun 19, 2021 at 6:57 PM Yuming Wang <wgy...@gmail.com> wrote:
>
>> +1
>>
>> Tested a batch of production queries with the Thrift Server.
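>>
>> For anyone wanting to run a similar check against the RC binary, roughly
>> (commands are illustrative):
>>
>>   # start the RC's Thrift Server and run a trivial query through beeline
>>   ./sbin/start-thriftserver.sh
>>   ./bin/beeline -u jdbc:hive2://localhost:10000 -e "SELECT 1"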
>>
>> On Sat, Jun 19, 2021 at 3:04 PM Mridul Muralidharan <mri...@gmail.com>
>> wrote:
>>
>>>
>>> +1
>>>
>>> Signatures, digests, etc. check out fine.
>>> Checked out the tag and built/tested with -Pyarn -Phadoop-2.7 -Pmesos
>>> -Pkubernetes
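>>>
>>> For reference, a rough sketch of equivalent verification and build
>>> commands (exact artifact names under the dist URL may vary):
>>>
>>>   # import release signing keys and verify a downloaded artifact
>>>   wget https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>   gpg --import KEYS
>>>   gpg --verify spark-3.0.3-bin-hadoop2.7.tgz.asc spark-3.0.3-bin-hadoop2.7.tgz
>>>
>>>   # build from the RC tag, then run the test suite with the same profiles
>>>   git checkout v3.0.3-rc1
>>>   ./build/mvn -Pyarn -Phadoop-2.7 -Pmesos -Pkubernetes -DskipTests clean package
>>>   ./build/mvn -Pyarn -Phadoop-2.7 -Pmesos -Pkubernetes test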
>>>
>>> Regards,
>>> Mridul
>>>
>>> PS: This might be a quirk of my local env - the first test run (after
>>> clean + package) usually fails for me (typically in the Hive tests), with
>>> a second run succeeding. This is not specific to this RC, though.
>>>
>>> On Fri, Jun 18, 2021 at 6:14 PM Liang-Chi Hsieh <vii...@gmail.com>
>>> wrote:
>>>
>>>> +1. Docs look good. Binaries look good.
>>>>
>>>> Ran simple tests and some TPC-DS queries.
>>>>
>>>> Thanks for working on this!
>>>>
>>>>
>>>> wuyi wrote
>>>> > Please vote on releasing the following candidate as Apache Spark
>>>> > version 3.0.3.
>>>> >
>>>> > The vote is open until Jun 21st 3 AM (PST) and passes if a majority of
>>>> > +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>>> >
>>>> > [ ] +1 Release this package as Apache Spark 3.0.3
>>>> > [ ] -1 Do not release this package because ...
>>>> >
>>>> > To learn more about Apache Spark, please see https://spark.apache.org/
>>>> >
>>>> > The tag to be voted on is v3.0.3-rc1 (commit
>>>> > 65ac1e75dc468f53fc778cd2ce1ba3f21067aab8):
>>>> > https://github.com/apache/spark/tree/v3.0.3-rc1
>>>> >
>>>> > The release files, including signatures, digests, etc. can be found at:
>>>> > https://dist.apache.org/repos/dist/dev/spark/v3.0.3-rc1-bin/
>>>> >
>>>> > Signatures used for Spark RCs can be found in this file:
>>>> > https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>> >
>>>> > The staging repository for this release can be found at:
>>>> > https://repository.apache.org/content/repositories/orgapachespark-1386/
>>>> >
>>>> > The documentation corresponding to this release can be found at:
>>>> > https://dist.apache.org/repos/dist/dev/spark/v3.0.3-rc1-docs/
>>>> >
>>>> > The list of bug fixes going into 3.0.3 can be found at the following URL:
>>>> > https://issues.apache.org/jira/projects/SPARK/versions/12349723
>>>> >
>>>> > This release uses the release script from the tag v3.0.3-rc1.
>>>> >
>>>> > FAQ
>>>> >
>>>> > =========================
>>>> > How can I help test this release?
>>>> > =========================
>>>> >
>>>> > If you are a Spark user, you can help us test this release by taking
>>>> > an existing Spark workload, running it on this release candidate, and
>>>> > reporting any regressions.
>>>> >
>>>> > If you're working in PySpark, you can set up a virtual env, install
>>>> > the current RC, and see if anything important breaks. In Java/Scala,
>>>> > you can add the staging repository to your project's resolvers and test
>>>> > with the RC (make sure to clean up the artifact cache before/after so
>>>> > you don't end up building with an out-of-date RC going forward).
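>>>> >
>>>> > For example, a minimal PySpark check could look roughly like this (a
>>>> > sketch; the exact pyspark artifact name under the -bin URL is assumed):
>>>> >
>>>> >   # install the RC's pyspark into a fresh virtual env and smoke-test it
>>>> >   python -m venv spark-rc && source spark-rc/bin/activate
>>>> >   pip install https://dist.apache.org/repos/dist/dev/spark/v3.0.3-rc1-bin/pyspark-3.0.3.tar.gz
>>>> >   python -c "import pyspark; print(pyspark.__version__)"
>>>> >
>>>> > and a Java/Scala resolution check against the staging repository, e.g.
>>>> > with coursier (coordinates assumed):
>>>> >
>>>> >   cs resolve -r https://repository.apache.org/content/repositories/orgapachespark-1386/ \
>>>> >     org.apache.spark:spark-core_2.12:3.0.3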
>>>> >
>>>> > ===========================================
>>>> > What should happen to JIRA tickets still targeting 3.0.3?
>>>> > ===========================================
>>>> >
>>>> > The current list of open tickets targeted at 3.0.3 can be found at
>>>> > https://issues.apache.org/jira/projects/SPARK by searching for "Target
>>>> > Version/s" = 3.0.3.
>>>> >
>>>> > Committers should look at those and triage. Extremely important bug
>>>> > fixes, documentation, and API tweaks that impact compatibility should
>>>> > be worked on immediately. Everything else should be retargeted to an
>>>> > appropriate release.
>>>> >
>>>> > ==================
>>>> > But my bug isn't fixed?
>>>> > ==================
>>>> >
>>>> > In order to make timely releases, we will typically not hold the
>>>> > release unless the bug in question is a regression from the previous
>>>> > release. That being said, if there is a regression that has not been
>>>> > correctly targeted, please ping me or a committer to help target the
>>>> > issue.
>>>>
