+1 (binding).

Verified Signature and SHA512.
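
For reference, a minimal sketch of such a checksum check (the file names are
placeholders, and it assumes the .sha512 file has the hex digest as its first
token):

    import hashlib

    # Placeholder names - substitute the downloaded artifact and the
    # corresponding .sha512 file from dist.apache.org.
    artifact = "apache_airflow_providers_google-2.1.0-py3-none-any.whl"
    checksum_file = artifact + ".sha512"

    with open(artifact, "rb") as f:
        digest = hashlib.sha512(f.read()).hexdigest()

    with open(checksum_file) as f:
        expected = f.read().split()[0]

    print("checksum OK" if digest == expected else "checksum MISMATCH")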

Based on the changes (and the changelog), I can verify that the following
providers should work fine:


   - spark
   - kubernetes
   - jenkins
   - microsoft.azure
   - mysql
   - telegram
   - and all the ones that just have doc changes


Regards,
Kaxil

On Tue, Mar 2, 2021 at 9:01 PM Ryan Hatter <[email protected]> wrote:

> There were some changes to the operator after my PR was merged:
> https://github.com/apache/airflow/blob/master/airflow/providers/google/cloud/transfers/gdrive_to_gcs.py
>
> Pak Andrey (Scuall1992 on GitHub) might be able to confirm the operator is
> functional.
>
> On Mar 2, 2021, at 13:16, Jarek Potiuk <[email protected]> wrote:
>
> 
> Hello everyone - just a reminder that the vote is (hopefully) finishing
> tomorrow.
>
> I'd love to get some votes for that.
>
> Just to clarify what the PMC votes mean: I believe there were some
> questions raised about the release process, which we are going to discuss
> tomorrow at the dev call, but let me express my interpretation of
> https://infra.apache.org/release-publishing.html
>
> A PMC member's vote (as I understand it) does not mean that this PMC member
> tested the release functionality (nor does the Release Manager's).
> It merely means that the PMC member agrees that the software was
> released according to the requirements and process described in
> https://infra.apache.org/release-publishing.html and that the signatures,
> checksums and software packages are as expected by the process.
> This is how I interpret this part of the release process: "Release managers
> do the mechanical work; but the PMC in general, and the PMC chair in
> particular (as an officer of the Foundation), are responsible for
> compliance with ASF requirements."
>
> My understanding is that it is not feasible (neither for Airflow nor for
> Providers) for the PMC members (or the release manager) to test the software
> and all features/bugfixes. We've never done that and I believe we never
> will. We reach out to the community to test, and we make a best
> effort to test whatever we release automatically (unit tests, integration
> tests, checking that the providers are installable/importable with Airflow
> 2.0 and the latest source code of Airflow). We can hardly do more than that.
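>
> For illustration, a minimal sketch of what such an importability check could
> look like (the module path is just one example of a provider module):
>
>     import importlib
>
>     # Spot-check that a provider module can be imported under the
>     # currently installed Airflow.
>     module = importlib.import_module(
>         "airflow.providers.google.cloud.transfers.gdrive_to_gcs"
>     )
>     print("imported:", module.__name__)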
>
> Happy to discuss it tomorrow, but in the meantime, if some of the PMC
> members could review the process and check compliance, so as to be ready to
> cast their votes - I'd love that.
>
> J.
>
> On Tue, Mar 2, 2021 at 8:44 PM Jarek Potiuk <[email protected]> wrote:
>
>> Hey Ryan,
>>
>> There is no **must** in re-testing it. Provided that you tested it
>> before with a real GSuite account, that is enough of a confirmation for me ;).
>>
>> J.
>>
>> On Sun, Feb 28, 2021 at 10:00 PM Abdur-Rahmaan Janhangeer <
>> [email protected]> wrote:
>>
>>> Salutes for having a GSuite account just for the functionality 👍👍👍
>>>
>>> On Mon, 1 Mar 2021, 00:05 Ryan Hatter, <[email protected]> wrote:
>>>
>>>> I canceled my GSuite account when my PR for the gdrive to gcs operator
>>>> was approved & merged. Could anyone help me verify that it still works
>>>> correctly?
>>>>
>>>>
>>>> On Feb 27, 2021, at 08:48, Jarek Potiuk <[email protected]> wrote:
>>>>
>>>> 
>>>> I created an issue where we will track the status of tests for the
>>>> providers (again - it is an experiment - but I'd really love to get feedback
>>>> on the new providers from those who contributed):
>>>> https://github.com/apache/airflow/issues/14511
>>>>
>>>> On Sat, Feb 27, 2021 at 4:28 PM Jarek Potiuk <[email protected]> wrote:
>>>>
>>>>>
>>>>> Hey all,
>>>>>
>>>>> I have just cut a new wave of Airflow Providers packages. This email is
>>>>> calling a vote on the release,
>>>>> which will last for 72 hours + a day for the weekend - which means that
>>>>> it will end on Wed 3 Mar 15:59:34 CET 2021.
>>>>>
>>>>> Consider this my (binding) +1.
>>>>>
>>>>> *KIND REQUEST*
>>>>>
>>>>> There was a recent discussion about the test quality of the providers, and
>>>>> I would like to try to address it while still keeping the batch release
>>>>> process every 3 weeks.
>>>>>
>>>>> We need a bit of help from the community. I have a kind request to the
>>>>> authors of fixes and new features. I have grouped the providers into those
>>>>> that likely need more testing and those that do not. I also added the names
>>>>> of those who submitted the changes and are most likely to be able to verify
>>>>> whether the RC packages solve the problems/add the features.
>>>>>
>>>>> This is a bit of an experiment (apologies for calling people out) - but if
>>>>> we find that it works, we can automate it. I will create a separate issue
>>>>> in GitHub where you will be able to "tick" the boxes for the providers you
>>>>> are listed under. It would not be a blocker if a provider is not tested,
>>>>> but it would be a great help if you could test the new RC providers and see
>>>>> whether they work as expected according to your changes.
>>>>>
>>>>> Providers with new features and fixes - likely need some testing:
>>>>>
>>>>> * *amazon* : Cristòfol Torrens, Ruben Laguna, Arati Nagmal, Ivica
>>>>> Kolenkaš, JavierLopezT
>>>>> * *apache.druid*: Xinbin Huang
>>>>> * *apache.spark*: Igor Khrol
>>>>> * *cncf.kubernetes*: jpyen, Ash Berlin-Taylor, Daniel Imberman
>>>>> * *google*:  Vivek Bhojawala, Xinbin Huang, Pak Andrey, uma66, Ryan
>>>>> Yuan, morrme, Sam Wheating, YingyingPeng22, Ryan Hatter, Tobiasz Kędzierski
>>>>> * *jenkins*: Maxim Lisovsky
>>>>> * *microsoft.azure*: flvndh, yyu
>>>>> * *mysql*: Constantino Schillebeeckx
>>>>> * *qubole*: Xinbin Huang
>>>>> * *salesforce*: Jyoti Dhiman
>>>>> * *slack*: Igor Khrol
>>>>> * *tableau*: Jyoti Dhiman
>>>>> * *telegram*: Shekhar Sing, Adil Khashtamov
>>>>>
>>>>> Providers with doc-only changes (no need to test):
>>>>>
>>>>> * apache-beam
>>>>> * apache-hive
>>>>> * dingding
>>>>> * docker
>>>>> * elasticsearch
>>>>> * exasol
>>>>> * http
>>>>> * neo4j
>>>>> * openfaas
>>>>> * papermill
>>>>> * presto
>>>>> * sendgrid
>>>>> * sftp
>>>>> * snowflake
>>>>> * sqlite
>>>>> * ssh
>>>>>
>>>>>
>>>>> Airflow Providers are available at:
>>>>> https://dist.apache.org/repos/dist/dev/airflow/providers/
>>>>>
>>>>> *apache-airflow-providers-<PROVIDER>-*-bin.tar.gz* are the Python
>>>>> "sdist" releases - they are also the official "sources" for the
>>>>> provider packages.
>>>>>
>>>>> *apache_airflow_providers_<PROVIDER>-*.whl are the binary
>>>>> Python "wheel" releases.
>>>>>
>>>>> The test procedure for PMC members who would like to test the RC
>>>>> candidates is described in
>>>>>
>>>>> https://github.com/apache/airflow/blob/master/dev/README_RELEASE_PROVIDER_PACKAGES.md#verify-the-release-by-pmc-members
>>>>>
>>>>> and for Contributors:
>>>>>
>>>>>
>>>>> https://github.com/apache/airflow/blob/master/dev/README_RELEASE_PROVIDER_PACKAGES.md#verify-by-contributors
>>>>>
>>>>>
>>>>> Public keys are available at:
>>>>> https://dist.apache.org/repos/dist/release/airflow/KEYS
>>>>>
>>>>> Please vote accordingly:
>>>>>
>>>>> [ ] +1 approve
>>>>> [ ] +0 no opinion
>>>>> [ ] -1 disapprove with the reason
>>>>>
>>>>>
>>>>> Only votes from PMC members are binding, but members of the community
>>>>> are
>>>>> encouraged to test the release and vote with "(non-binding)".
>>>>>
>>>>> Please note that the version number excludes the 'rcX' string.
>>>>> This will allow us to rename the artifact without modifying
>>>>> the artifact checksums when we actually release.
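>>>>>
>>>>> As a quick illustration (with made-up file names) that the checksum depends
>>>>> only on the file contents, not on the file name:
>>>>>
>>>>>     import hashlib, shutil
>>>>>
>>>>>     # Hypothetical RC artifact; renaming it to the final name
>>>>>     # leaves the SHA512 digest unchanged.
>>>>>     before = hashlib.sha512(open("pkg-1.0.2rc1.tar.gz", "rb").read()).hexdigest()
>>>>>     shutil.move("pkg-1.0.2rc1.tar.gz", "pkg-1.0.2.tar.gz")
>>>>>     after = hashlib.sha512(open("pkg-1.0.2.tar.gz", "rb").read()).hexdigest()
>>>>>     assert before == after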
>>>>>
>>>>>
>>>>> Each of the packages contains a link to the detailed changelog. The
>>>>> changelogs have been moved to the official Airflow documentation:
>>>>> https://github.com/apache/airflow-site/<TODO COPY LINK TO BRANCH>
>>>>>
>>>>> <PASTE ANY HIGH-LEVEL DESCRIPTION OF THE CHANGES HERE!>
>>>>>
>>>>>
>>>>> Note that the links to documentation from the PyPI packages will not work
>>>>> until we merge
>>>>> the changes to the Airflow site after releasing the packages officially.
>>>>>
>>>>> https://pypi.org/project/apache-airflow-providers-amazon/1.2.0rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-apache-beam/1.0.1rc1/
>>>>>
>>>>> https://pypi.org/project/apache-airflow-providers-apache-druid/1.1.0rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-apache-hive/1.0.2rc1/
>>>>>
>>>>> https://pypi.org/project/apache-airflow-providers-apache-spark/1.0.2rc1/
>>>>>
>>>>> https://pypi.org/project/apache-airflow-providers-cncf-kubernetes/1.0.2rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-dingding/1.0.2rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-docker/1.0.2rc1/
>>>>>
>>>>> https://pypi.org/project/apache-airflow-providers-elasticsearch/1.0.2rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-exasol/1.1.1rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-google/2.1.0rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-http/1.1.1rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-jenkins/1.1.0rc1/
>>>>>
>>>>> https://pypi.org/project/apache-airflow-providers-microsoft-azure/1.2.0rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-mysql/1.0.2rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-neo4j/1.0.1rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-openfaas/1.1.1rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-papermill/1.0.2rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-presto/1.0.2rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-qubole/1.0.2rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-salesforce/2.0.0rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-sendgrid/1.0.2rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-sftp/1.1.1rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-slack/3.0.0rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-snowflake/1.1.1rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-sqlite/1.0.2rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-ssh/1.2.0rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-tableau/1.0.0rc1/
>>>>> https://pypi.org/project/apache-airflow-providers-telegram/1.0.2rc1/
>>>>>
>>>>> Cheers,
>>>>> J.
>>>>>
>>>>>
>>>>>
>>>>> --
>>>>> +48 660 796 129
>>>>>
>>>>
>>>>
>>>> --
>>>> +48 660 796 129
>>>>
>>>>
>>
>> --
>> +48 660 796 129
>>
>
>
> --
> +48 660 796 129
>
>
