kaxil commented on a change in pull request #11310:
URL: https://github.com/apache/airflow/pull/11310#discussion_r501602658
##########
File path: dev/README.md
##########
@@ -20,240 +20,196 @@
 <!-- DON'T EDIT THIS SECTION, INSTEAD RE-RUN doctoc TO UPDATE -->
 **Table of contents**
-- [Development Tools](#development-tools)
-  - [Airflow release signing tool](#airflow-release-signing-tool)
-- [Verifying the release candidate by PMCs (legal)](#verifying-the-release-candidate-by-pmcs-legal)
-  - [PMC voting](#pmc-voting)
-  - [SVN check](#svn-check)
-  - [Verifying the licences](#verifying-the-licences)
-  - [Verifying the signatures](#verifying-the-signatures)
-  - [Verifying the SHA512 sum](#verifying-the-sha512-sum)
-- [Verifying if the release candidate "works" by Contributors](#verifying-if-the-release-candidate-works-by-contributors)
-- [Building an RC](#building-an-rc)
-- [PyPI Snapshots](#pypi-snapshots)
-- [Make sure your public key is on id.apache.org and in KEYS](#make-sure-your-public-key-is-on-idapacheorg-and-in-keys)
-- [Voting on an RC](#voting-on-an-rc)
-- [Publishing release](#publishing-release)
-- [Publishing to PyPi](#publishing-to-pypi)
-- [Updating CHANGELOG.md](#updating-changelogmd)
-- [Notifying developers of release](#notifying-developers-of-release)
+- [Apache Airflow source releases](#apache-airflow-source-releases)
+  - [Apache Airflow Package](#apache-airflow-package)
+  - [Backport Provider packages](#backport-provider-packages)
+- [Prerequisites for the release manager preparing the release](#prerequisites-for-the-release-manager-preparing-the-release)
+  - [Upload Public keys to id.apache.org](#upload-public-keys-to-idapacheorg)
+  - [Configure PyPI uploads](#configure-pypi-uploads)
+  - [Hardware used to prepare and verify the packages.](#hardware-used-to-prepare-and-verify-the-packages)
+- [Apache Airflow packages](#apache-airflow-packages)
+  - [Prepare the Apache Airflow Package RC](#prepare-the-apache-airflow-package-rc)
+  - [Vote and verify the Apache Airflow release candidate](#vote-and-verify-the-apache-airflow-release-candidate)
+  - [Publish the final Apache Airflow release](#publish-the-final-apache-airflow-release)
+- [Backport Provider Packages](#backport-provider-packages)
+  - [Decide when to release](#decide-when-to-release)
+  - [Prepare the Backport Provider Packages RC](#prepare-the-backport-provider-packages-rc)
+  - [Vote and verify the Backport Providers release candidate](#vote-and-verify-the-backport-providers-release-candidate)
+  - [Publish the final releases of backport packages](#publish-the-final-releases-of-backport-packages)

 <!-- END doctoc generated TOC please keep comment here to allow auto update -->

-# Development Tools
+# Apache Airflow source releases

-## Airflow release signing tool
+The Apache Airflow releases are one of the two types:

-The release signing tool can be used to create the SHA512/MD5 and ASC files that required for Apache releases.
+* Releases of the Apache Airflow package
+* Releases of the Backport Providers Packages

-### Execution
+## Apache Airflow Package

-To create a release tarball execute following command from Airflow's root.
+This package contains sources that allow the user building fully-functional Apache Airflow 2.0 package.
+They contain sources for:

-```bash
-python setup.py compile_assets sdist --formats=gztar
-```
-
-*Note: `compile_assets` command build the frontend assets (JS and CSS) files for the
-Web UI using webpack and yarn. Please make sure you have `yarn` installed on your local machine globally.
-Details on how to install `yarn` can be found in CONTRIBUTING.rst file.*
+ * "apache-airflow" python package that installs "airflow" Python package and includes
+   all the assets required to release the webserver UI coming with Apache Airflow
+ * Dockerfile and corresponding scripts that build and use an official DockerImage
+ * Breeze development environment that helps with building images and testing locally
+   apache airflow built from sources

-After that navigate to relative directory i.e., `cd dist` and sign the release files.
+In the future (Airflow 2.0) this package will be split into separate "core" and "providers" packages that
+will be distributed separately, following the mechanisms introduced in Backport Package Providers. We also
+plan to release the official Helm Chart sources that will allow the user to install Apache Airflow
+via helm 3.0 chart in a distributed fashion.

-```bash
-../dev/sign.sh <the_created_tar_ball.tar.gz
-```
+The Source releases are the only "official" Apache Software Foundation releases, and they are distributed
+via [Official Apache Download sources](https://downloads.apache.org/)

-Signing files will be created in the same directory.
+Following source releases Apache Airflow release manager also distributes convenience packages:
+* PyPI packages released via https://pypi.org/project/apache-airflow/
+* Docker Images released via https://hub.docker.com/repository/docker/apache/airflow

-# Verifying the release candidate by PMCs (legal)
+Those convenience packages are not "official releases" of Apache Airflow, but the users who
+cannot or do not want to build the packages themselves can use them as a convenient way of installing
+Apache Airflow, however they are not considered as "official source releases". You can read more
+details about it in the [ASF Release Policy](http://www.apache.org/legal/release-policy.html).

-## PMC voting
+This document describes the process of releasing both - official source packages and convenience
+packages for Apache Airflow packages.

-The PMCs should verify the releases in order to make sure the release is following the
-[Apache Legal Release Policy](http://www.apache.org/legal/release-policy.html).
+## Backport Provider packages

-At least 3 (+1) votes should be recorded in accordance to
-[Votes on Package Releases](https://www.apache.org/foundation/voting.html#ReleaseVotes)
+The Backport Provider packages are packages (per provider) that make it possible to easily use Hooks,
+Operators, Sensors, and Secrets from the 2.0 version of Airflow in the 1.10.* series.

-The legal checks include:
+Once you release the packages, you can simply install them with:

-* checking if the packages are present in the right dist folder on svn
-* verifying if all the sources have correct licences
-* verifying if release manager signed the releases with the right key
-* verifying if all the checksums are valid for the release
+```
+pip install apache-airflow-backport-providers-<PROVIDER>[<EXTRAS>]
+```

-## SVN check
+Where `<PROVIDER>` is the provider id and `<EXTRAS>` are optional extra packages to install.
+You can find the provider packages dependencies and extras in the README.md files in each provider
+package (in `airflow/providers/<PROVIDER>` folder) as well as in the PyPI installation page.

-The files should be present in the sub-folder of
-[Airflow dist](https://dist.apache.org/repos/dist/dev/airflow/)
+Backport providers are a great way to migrate your DAGs to Airflow-2.0 compatible DAGs. You can
+switch to the new Airflow-2.0 packages in your DAGs, long before you attempt to migrate
+airflow to 2.0 line.

-The following files should be present (9 files):
+The sources released in SVN allow to build all the provider packages by the user, following the
+instructions and scripts provided. Those are also "official_source releases" as described in the
+[ASF Release Policy](http://www.apache.org/legal/release-policy.html) and they are available
+via [Official Apache Download sources]https://downloads.apache.org/airflow/backport-providers/.

Review comment:
```suggestion
via [Official Apache Download sources](https://downloads.apache.org/airflow/backport-providers/).
```

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use
the URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
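For anyone applying the suggestion above, the snippet below is an optional, illustrative sketch and not part of this PR: a grep heuristic that flags Markdown links written as `[text]url` instead of `[text](url)` (the exact issue raised in this review), plus one concrete instantiation of the `pip install apache-airflow-backport-providers-<PROVIDER>[<EXTRAS>]` form quoted in the diff. The provider id (`google`) and extra (`amazon`) are assumed placeholders, not values taken from the PR.

```bash
# Heuristic check for Markdown links missing "(" after the closing bracket,
# e.g. "[Official Apache Download sources]https://..." as flagged above.
# It only catches links where "]" is immediately followed by a URL.
grep -nE ']https?://' dev/README.md || echo "No suspicious links found"

# Illustrative instantiation of the install command described in the README diff;
# the provider id and extra are placeholders chosen for this example only.
pip install 'apache-airflow-backport-providers-google[amazon]'
```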