You can see it here:
https://pypi.org/project/apache-airflow-providers-amazon/#history

I cancelled the rc1 release of 3.1.0 due to bugs and then released
the new wave of providers including the "classifier" changes.

That was a deliberate decision for two packages (amazon and dbt). The
reason is that they are conceptually not the "same" packages as the
RC1s. They also contain "Trove Classifiers" which 3.0.0rc1 of amazon
and 1.0.0 of dbt did not have. This is different from "yet another
bugfix added" - it is conceptually a different package with different
metadata.
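
For the record, the metadata difference looks roughly like this (a
simplified, illustrative sketch of a provider's setup.py - not the
actual generated file):

    from setuptools import find_namespace_packages, setup

    setup(
        name="apache-airflow-providers-amazon",
        version="3.1.1",
        packages=find_namespace_packages(
            include=["airflow.providers.amazon", "airflow.providers.amazon.*"]
        ),
        classifiers=[
            # The new Trove classifiers the cancelled RC1s did not carry:
            "Framework :: Apache Airflow",
            "Framework :: Apache Airflow :: Provider",
        ],
    )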

Before I did it, I looked at the SemVer spec:
https://semver.org/ and I did not find a strict requirement that a
release series must always start from .0, so I think it's perfectly
fine to omit the .0 releases and go to .1 directly in both cases. I
also could not find any reason why it would create any problem. From
the user's perspective it would be equivalent to "yanking" the .0
release - only without actually releasing anything.

It also made the release easier due to the automation we have and the
mechanics of the release. It actually made it possible to release all
those packages in a single wave rather than splitting them (running
two votes for providers at the same time would be rather confusing).

Do you think it creates any problem? Does it make sense? If it is a
problem, then we could technically even release the .0 versions and
yank them immediately, but that is - I believe - completely
unnecessary.

J.


On Thu, Mar 17, 2022 at 12:18 AM Kaxil Naik <[email protected]> wrote:
>
> Why is the amazon provider bumped from 3.0.0 to 3.1.1 instead of 3.0.1 ?
>
> On Wed, 16 Mar 2022 at 13:29, Josh Fell <[email protected]> 
> wrote:
>>
>> +1 (non-binding)
>>
>> Gave the new dbt Cloud provider a full runthrough.
>>
>> On Tue, Mar 15, 2022 at 2:27 PM Jarek Potiuk <[email protected]> wrote:
>>>
>>> We have almost all changes in this release tested now :). Looking
>>> forward to a (hopefully) 100% test coverage :) (and voting on the
>>> release too :))
>>>
>>> On Tue, Mar 15, 2022 at 1:35 AM Jarek Potiuk <[email protected]> wrote:
>>> >
>>> > Hey all,
>>> >
>>> > I have just cut the new wave of Airflow Providers packages. This
>>> > email is calling a vote on the release,
>>> > which will last for 72 hours - which means that it will end on Fri Mar
>>> > 18 01:40 CET 2022.
>>> >
>>> > Consider this my (binding) +1.
>>> >
>>> > This is another "special" release, as again all providers are
>>> > released due to the Trove Classifiers:
>>> > https://pypi.org/search/?q=&o=&c=Framework+%3A%3A+Apache+Airflow&c=Framework+%3A%3A+Apache+Airflow+%3A%3A+Provider
>>> >
>>> > Most of the providers are the same as before, with the exception
>>> > of the new classifiers. The ones that contain other changes are:
>>> >
>>> > * alibaba
>>> > * amazon
>>> > * databricks
>>> > * docker
>>> > * google
>>> > * snowflake
>>> >
>>> > Airflow Providers are available at:
>>> > https://dist.apache.org/repos/dist/dev/airflow/providers/
>>> >
>>> > *apache-airflow-providers-<PROVIDER>-*.tar.gz* are the source
>>> >  Python "sdist" releases - they are also the official "sources" for
>>> > the provider packages.
>>> >
>>> > *apache_airflow_providers_<PROVIDER>-*.whl are the binary
>>> >  Python "wheel" release.
>>> >
>>> > The test procedure for PMC members who would like to test the RC
>>> > candidates is described in
>>> > https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDER_PACKAGES.md#verify-the-release-by-pmc-members
>>> >
>>> > and for Contributors:
>>> >
>>> > https://github.com/apache/airflow/blob/main/dev/README_RELEASE_PROVIDER_PACKAGES.md#verify-by-contributors
>>> >
>>> >
>>> > Public keys are available at:
>>> > https://dist.apache.org/repos/dist/release/airflow/KEYS
>>> >
>>> > Please vote accordingly:
>>> >
>>> > [ ] +1 approve
>>> > [ ] +0 no opinion
>>> > [ ] -1 disapprove with the reason
>>> >
>>> >
>>> > Only votes from PMC members are binding, but members of the community are
>>> > encouraged to test the release and vote with "(non-binding)".
>>> >
>>> > Please note that the version number excludes the 'rcX' string.
>>> > This will allow us to rename the artifact without modifying
>>> > the artifact checksums when we actually release.
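>>> >
>>> > (That works because the checksum depends only on the file contents,
>>> > not on its name - a trivial sketch with made-up filenames:)
>>> >
>>> >     # Renaming the rc artifact to its final name leaves the sha512
>>> >     # digest unchanged, because only the bytes are hashed.
>>> >     import hashlib
>>> >     import shutil
>>> >
>>> >     before = hashlib.sha512(
>>> >         open("pkg-1.0.0rc1.tar.gz", "rb").read()).hexdigest()
>>> >     shutil.move("pkg-1.0.0rc1.tar.gz", "pkg-1.0.0.tar.gz")
>>> >     after = hashlib.sha512(
>>> >         open("pkg-1.0.0.tar.gz", "rb").read()).hexdigest()
>>> >     assert before == after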
>>> >
>>> > The status of testing the providers by the community is kept here:
>>> >
>>> > https://github.com/apache/airflow/issues/22264
>>> >
>>> > You can find the packages as well as detailed changelogs at the
>>> > links below:
>>> >
>>> > https://pypi.org/project/apache-airflow-providers-airbyte/2.1.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-alibaba/1.1.0rc1/
>>> > https://pypi.org/project/apache-airflow-providers-amazon/3.1.1rc1/
>>> > https://pypi.org/project/apache-airflow-providers-apache-beam/3.2.1rc1/
>>> > https://pypi.org/project/apache-airflow-providers-apache-cassandra/2.1.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-apache-drill/1.0.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-apache-druid/2.3.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-apache-hdfs/2.2.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-apache-hive/2.3.1rc1/
>>> > https://pypi.org/project/apache-airflow-providers-apache-kylin/2.0.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-apache-livy/2.2.1rc1/
>>> > https://pypi.org/project/apache-airflow-providers-apache-pig/2.0.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-apache-pinot/2.0.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-apache-spark/2.1.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-apache-sqoop/2.1.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-asana/1.1.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-celery/2.1.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-cloudant/2.0.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-cncf-kubernetes/3.1.1rc1/
>>> > https://pypi.org/project/apache-airflow-providers-databricks/2.4.0rc1/
>>> > https://pypi.org/project/apache-airflow-providers-datadog/2.0.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-dbt-cloud/1.0.1rc1/
>>> > https://pypi.org/project/apache-airflow-providers-dingding/2.0.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-discord/2.0.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-docker/2.5.1rc1/
>>> > https://pypi.org/project/apache-airflow-providers-elasticsearch/3.0.1rc1/
>>> > https://pypi.org/project/apache-airflow-providers-exasol/2.1.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-facebook/2.2.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-ftp/2.1.1rc1/
>>> > https://pypi.org/project/apache-airflow-providers-github/1.0.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-google/6.6.0rc1/
>>> > https://pypi.org/project/apache-airflow-providers-grpc/2.0.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-hashicorp/2.1.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-http/2.1.1rc1/
>>> > https://pypi.org/project/apache-airflow-providers-imap/2.2.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-influxdb/1.1.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-jdbc/2.1.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-jenkins/2.0.6rc1/
>>> > https://pypi.org/project/apache-airflow-providers-jira/2.0.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-microsoft-azure/3.7.1rc1/
>>> > https://pypi.org/project/apache-airflow-providers-microsoft-mssql/2.1.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-microsoft-psrp/1.1.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-microsoft-winrm/2.0.4rc1/
>>> > https://pypi.org/project/apache-airflow-providers-mongo/2.3.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-mysql/2.2.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-neo4j/2.1.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-odbc/2.0.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-openfaas/2.0.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-opsgenie/3.0.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-oracle/2.2.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-pagerduty/2.1.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-papermill/2.2.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-plexus/2.0.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-postgres/4.0.1rc1/
>>> > https://pypi.org/project/apache-airflow-providers-presto/2.1.1rc1/
>>> > https://pypi.org/project/apache-airflow-providers-qubole/2.1.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-redis/2.0.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-salesforce/3.4.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-samba/3.0.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-segment/2.0.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-sendgrid/2.0.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-sftp/2.5.1rc1/
>>> > https://pypi.org/project/apache-airflow-providers-singularity/2.0.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-slack/4.2.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-snowflake/2.5.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-sqlite/2.1.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-ssh/2.4.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-tableau/2.1.6rc1/
>>> > https://pypi.org/project/apache-airflow-providers-telegram/2.0.3rc1/
>>> > https://pypi.org/project/apache-airflow-providers-trino/2.1.1rc1/
>>> > https://pypi.org/project/apache-airflow-providers-vertica/2.1.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-yandex/2.2.2rc1/
>>> > https://pypi.org/project/apache-airflow-providers-zendesk/3.0.2rc1/
>>> >
>>> > Cheers,
>>> > J.
