mik-laj commented on a change in pull request #11310:
URL: https://github.com/apache/airflow/pull/11310#discussion_r501937645



##########
File path: dev/README.md
##########
@@ -567,8 +704,645 @@ 
https://airflow.apache.org/changelog.html#airflow-1-10-2-2019-01-19
 
 Cheers,
 <your name>
-</p>
-</details>
+EOF
+```
+
+### Update Announcements page
+
+Update "Announcements" page at the [Official Airflow 
website](https://airflow.apache.org/announcements/)
+
+
+-----------------------------------------------------------------------------------------------------------
+
+
+# Backport Provider Packages
+
+You can read more about the command line tools used to generate backport packages in
+[Backport Providers README](../backport_packages/README.md).
+
+## Decide when to release
+
+You can release backport packages separately on an ad-hoc basis, whenever we find that a given provider needs
+to be released - due to new features or due to bug fixes. You can release each backport package
+separately.
+
+We are using the [CALVER](https://calver.org/) versioning scheme for the backport packages. We also have an
+automated way to prepare and build the packages, so it should be very easy to release the packages often and
+separately.
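+
+For example, a package cut on May 20, 2020 is versioned `2020.5.20`, and its second release
+candidate `2020.5.20rc2` (as in the export further below). A quick way to produce today's CALVER
+date (a sketch, assuming GNU `date`, where `%-m`/`%-d` suppress zero-padding to match the
+examples in this document):
+
+```shell script
+# Prints e.g. 2020.5.20 - the non-padded CALVER format used for backport packages
+date +%Y.%-m.%-d
+```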
+
+## Prepare the Backport Provider Packages RC
+
+### Generate release notes
+
+Prepare release notes for all the packages you plan to release, where YYYY.MM.DD is the CALVER
+date for the packages:
+
+```
+./breeze prepare-backport-readme YYYY.MM.DD [packages]
+```
+
+If you iterate with merges and release candidates, you can update the existing release notes
+by running the command without providing the date:
+
+```
+./breeze prepare-backport-readme google
+```
+
+The generated readme files should eventually be committed to the repository.
+
+### Build an RC release for SVN apache upload
+
+The Release Candidate artifacts we vote upon should be the exact ones we release, without any
+modification other than renaming, i.e. the contents of the files must be the same between the voted
+release candidate and the final release. Because of this, the version in the built artifacts
+that will become the official Apache releases must not include the rcN suffix. They also need
+to be signed and have checksum files. You can generate the checksum/signature files by running
+the "dev/sign.sh" script (assuming you have the right PGP key set up for signing). The script
+generates corresponding .asc and .sha512 files for each file to sign.
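+
+For reference, a minimal sketch of what such a signing script does per file (the real
+"dev/sign.sh" may differ in its details and options):
+
+```shell script
+# Hypothetical equivalent of dev/sign.sh for each file passed as an argument
+for f in "$@"
+do
+    gpg --armor --detach-sign "$f"               # produces ${f}.asc
+    gpg --print-md SHA512 "$f" > "${f}.sha512"   # produces ${f}.sha512
+done
+```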
+
+#### Build and sign the source and convenience packages
+
+* Set environment variables (version and root of airflow repo)
+
+```shell script
+export VERSION=2020.5.20rc2
+export AIRFLOW_REPO_ROOT=$(pwd)
+
+```
+
+* Build the source package:
+
+```
+./backport_packages/build_source_package.sh
+
+```
+
+It will generate `apache-airflow-backport-providers-${VERSION}-source.tar.gz`
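+
+You can quickly sanity-check the contents of the generated tarball, for example:
+
+```shell script
+# List the first entries of the source tarball to confirm it looks as expected
+tar -tzf apache-airflow-backport-providers-${VERSION}-source.tar.gz | head
+```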
+
+* Generate the packages - since we are preparing packages for the SVN repo, we should use the right switch. Note
+  that this will clean up the dist folder before generating the packages, so it will only contain the packages
+  you intended to build.
+
+```shell script
+./breeze prepare-backport-packages --version-suffix-for-svn rc1
+```
+
+If you only build a few packages, run:
+
+```shell script
+./breeze prepare-backport-packages --version-suffix-for-svn rc1 PACKAGE PACKAGE ....
+```
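+
+For example, to build only two selected packages (the provider ids here are illustrative - use
+the ones you actually intend to release):
+
+```shell script
+./breeze prepare-backport-packages --version-suffix-for-svn rc1 google amazon
+```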
+
+* Move the source tarball to the dist folder
+
+```shell script
+mv apache-airflow-backport-providers-${VERSION}-source.tar.gz dist
+```
+
+* Sign all your packages
+
+```shell script
+pushd dist
+../dev/sign.sh *
+popd
+```
+
+* Push tags to the Apache repository (assuming that you have the apache remote pointing to the apache/airflow repo)
+
+```shell script
+git push apache backport-providers-${VERSION}
+```
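+
+If the tag does not exist locally yet, you can create it first (a sketch - whether release tags
+are signed is a convention for the release manager to decide):
+
+```shell script
+git tag backport-providers-${VERSION}
+```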
+
+#### Commit the source packages to Apache SVN repo
+
+* Push the artifacts to ASF dev dist repo
+
+```shell script
+# First clone the repo if you do not have it
+svn checkout https://dist.apache.org/repos/dist/dev/airflow airflow-dev
+
+# update the repo in case you have it already
+cd airflow-dev
+svn update
+cd ..
+
+# Create a new folder for the release.
+cd airflow-dev/backport-providers
+svn mkdir ${VERSION}
+
+# Move the artifacts to svn folder
+mv ${AIRFLOW_REPO_ROOT}/dist/* ${VERSION}/
+
+# Add and commit
+svn add ${VERSION}/*
+svn commit -m "Add artifacts for Airflow ${VERSION}"
+
+cd ${AIRFLOW_REPO_ROOT}
+```
+
+Verify that the files are available at
+[backport-providers](https://dist.apache.org/repos/dist/dev/airflow/backport-providers/)
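+
+You can also check this from the command line (a sketch using plain svn tooling):
+
+```shell script
+# List the uploaded artifacts for this release
+svn ls https://dist.apache.org/repos/dist/dev/airflow/backport-providers/${VERSION}/
+```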
+
+### Publish the RC convenience package to PyPI
+
+In order to publish to PyPI you just need to build and release the packages. The packages should however
+contain the rcN suffix in the version name as well, so you need to use the `--version-suffix-for-pypi` switch
+to prepare those packages. Note that these are different packages from the ones used for the SVN upload,
+though they should be generated from the same sources.
+
+* Generate the packages with the right RC version (specify the version suffix with the PyPI switch). Note that
+this will clean up the dist folder before generating the packages, so you will only have the right packages there.
+
+```shell script
+./breeze prepare-backport-packages --version-suffix-for-pypi rc1
+```
+
+If you only build a few packages, run:
+
+```shell script
+./breeze prepare-backport-packages --version-suffix-for-pypi rc1 PACKAGE PACKAGE ....
+```
+
+* Verify the artifacts that would be uploaded:
+
+```shell script
+twine check dist/*
+```
+
+* Upload the package to PyPI's test environment:
+
+```shell script
+twine upload -r pypitest dist/*
+```
+
+* Verify that the test packages look good by downloading them and installing them into a virtual environment.
+Twine prints the package links as output - separately for each package.
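+
+For example, a quick smoke test in a fresh virtual environment (a sketch - it assumes the
+`pypitest` repository maps to test.pypi.org and uses the google package as an illustration;
+the extra index lets pip resolve dependencies that are not on the test instance):
+
+```shell script
+python -m venv /tmp/backport-rc-test
+source /tmp/backport-rc-test/bin/activate
+pip install -i https://test.pypi.org/simple/ --extra-index-url https://pypi.org/simple/ \
+    "apache-airflow-backport-providers-google==${VERSION}"
+```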
+
+* Upload the package to PyPI's production environment:
+
+```shell script
+twine upload -r pypi dist/*
+```
+
+* Copy the list of links to the uploaded packages - they will be useful in preparing the VOTE email.
+
+* Again, confirm that the packages are available under the links printed.
+
+## Vote and verify the Backport Providers release candidate
+
+### Prepare voting email for Backport Providers release candidate
+
+Make sure the packages are in https://dist.apache.org/repos/dist/dev/airflow/backport-providers/
+
+Send out a vote to the d...@airflow.apache.org mailing list. Here you can prepare the text of the
+email using the ${VERSION} variable you already set in the command line.
+
+Subject:
+
+
+```shell script
+cat <<EOF
+[VOTE] Airflow Backport Providers ${VERSION}
+EOF
+```
+
+```shell script
+cat <<EOF
+Hey all,
+
+I have cut Airflow Backport Providers ${VERSION}. This email is calling a vote on the release,
+which will last for 72 hours - which means that it will end on $(date -d '+3 days').
+
+Consider this my (binding) +1.
+
+Airflow Backport Providers ${VERSION} are available at:
+https://dist.apache.org/repos/dist/dev/airflow/backport-providers/${VERSION}/
+
+*apache-airflow-backport-providers-${VERSION}-source.tar.gz* is a source release that comes
+ with INSTALL instructions.
+
+*apache-airflow-backport-providers-<PROVIDER>-${VERSION}-bin.tar.gz* are the binary
+ Python "sdist" releases.
+
+The test procedure for PMCs and Contributors who would like to test the RC candidates is described in
+https://github.com/apache/airflow/blob/master/dev/README.md#vote-and-verify-the-backport-providers-release-candidate
+
+
+Public keys are available at:
+https://dist.apache.org/repos/dist/release/airflow/KEYS
+
+Please vote accordingly:
+
+[ ] +1 approve
+[ ] +0 no opinion
+[ ] -1 disapprove with the reason
+
+
+Only votes from PMC members are binding, but members of the community are
+encouraged to test the release and vote with "(non-binding)".
+
+Please note that the version number excludes the 'rcX' string, so it's now
+simply ${VERSION%rc?}. This will allow us to rename the artifact without modifying
+the artifact checksums when we actually release.
+
+Each of the packages contains a detailed changelog. Here is the list of links to
+the released packages and changelogs:
+
+TODO: Paste the result of twine upload
+
+Cheers,
+<TODO: Your Name>
+
+EOF
+```
+
+Due to the nature of backport packages, not all packages have to be released as convenience
+packages in the final release. During the voting process
+the voting PMCs might decide to exclude certain packages from the release if some critical
+problems have been found in some packages.
+
+Please modify the message above accordingly to clearly exclude those packages.
+
+### Verify the release
+
+#### SVN check
+
+The files should be present in the sub-folder of
+[Airflow dist](https://dist.apache.org/repos/dist/dev/airflow/backport-providers/)
+
+The following files should be present:
+
+* -source.tar.gz + .asc + .sha512 (one set of files)
+* -bin.tar.gz + .asc + .sha512 (one set of files per provider)
+* .whl + .asc + .sha512 (one set of files per provider)
+
+As a PMC member, you should be able to clone the SVN repository:
+
+```shell script
+svn co https://dist.apache.org/repos/dist/dev/airflow/
+```
+
+Or update it if you already checked it out:
+
+```shell script
+svn update .
+```
+
+#### Verify the licences
+
+This can be done with the Apache RAT tool.
+
+* Download the latest jar from https://creadur.apache.org/rat/download_rat.cgi (unpack the sources,
+  the jar is inside)
+* Unpack the -source.tar.gz to a folder
+* Enter the folder and run the check (point to the place where you extracted the .jar)
+
+```shell script
+java -jar ../../apache-rat-0.13/apache-rat-0.13.jar -E .rat-excludes -d .
+```
+
+#### Verify the signatures
+
+Make sure you have imported the key of the person who signed the release into your GPG keyring. You can find the valid keys in
+[KEYS](https://dist.apache.org/repos/dist/release/airflow/KEYS).
+
+You can import the whole KEYS file:
+
+```shell script
+gpg --import KEYS
+```
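+
+If you do not have the KEYS file locally yet, you can fetch it first, for example:
+
+```shell script
+curl https://dist.apache.org/repos/dist/release/airflow/KEYS -o KEYS
+```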
+
+You can also import the keys individually from a keyserver. The example below uses Kaxil's key and
+retrieves it from the default GPG keyserver
+[OpenPGP.org](https://keys.openpgp.org):
+
+```shell script
+gpg --receive-keys 12717556040EEF2EEAF1B9C275FCCD0A25FA0E4B
+```
+
+You should choose to import the key when asked.
+
+Note that because it is the default, the OpenPGP keyserver tends to be overloaded and might respond with
+errors or timeouts. Many of the release managers also uploaded their keys to the
+[GNUPG.net](https://keys.gnupg.net) keyserver, and you can retrieve them from there.
+
+```shell script
+gpg --keyserver keys.gnupg.net --receive-keys 12717556040EEF2EEAF1B9C275FCCD0A25FA0E4B
+```
+
+Once you have the keys, the signatures can be verified by running this:
+
+```shell script
+for i in *.asc
+do
+   echo "Checking $i"; gpg --verify `basename $i .sha512 `
+done
+```
+
+This should produce results similar to the below. The "Good signature from ..." message is an indication
+that the signatures are correct. Do not worry about the "not certified with a trusted signature"
+warning. Most of the certificates used by release managers are self-signed, and that's why you get this
+warning. Since you imported the key via its ID from the
+[KEYS](https://dist.apache.org/repos/dist/release/airflow/KEYS) page in the previous step, you already
+know that it is a valid key.
+
+```
+Checking apache-airflow-1.10.12rc4-bin.tar.gz.asc
+gpg: assuming signed data in 'apache-airflow-1.10.12rc4-bin.tar.gz'
+gpg: Signature made sob, 22 sie 2020, 20:28:28 CEST
+gpg:                using RSA key 12717556040EEF2EEAF1B9C275FCCD0A25FA0E4B
+gpg: Good signature from "Kaxil Naik <kaxiln...@gmail.com>" [unknown]
+gpg: WARNING: This key is not certified with a trusted signature!
+gpg:          There is no indication that the signature belongs to the owner.
+Primary key fingerprint: 1271 7556 040E EF2E EAF1  B9C2 75FC CD0A 25FA 0E4B
+Checking apache_airflow-1.10.12rc4-py2.py3-none-any.whl.asc
+gpg: assuming signed data in 'apache_airflow-1.10.12rc4-py2.py3-none-any.whl'
+gpg: Signature made sob, 22 sie 2020, 20:28:31 CEST
+gpg:                using RSA key 12717556040EEF2EEAF1B9C275FCCD0A25FA0E4B
+gpg: Good signature from "Kaxil Naik <kaxiln...@gmail.com>" [unknown]
+gpg: WARNING: This key is not certified with a trusted signature!
+gpg:          There is no indication that the signature belongs to the owner.
+Primary key fingerprint: 1271 7556 040E EF2E EAF1  B9C2 75FC CD0A 25FA 0E4B
+Checking apache-airflow-1.10.12rc4-source.tar.gz.asc
+gpg: assuming signed data in 'apache-airflow-1.10.12rc4-source.tar.gz'
+gpg: Signature made sob, 22 sie 2020, 20:28:25 CEST
+gpg:                using RSA key 12717556040EEF2EEAF1B9C275FCCD0A25FA0E4B
+gpg: Good signature from "Kaxil Naik <kaxiln...@gmail.com>" [unknown]
+gpg: WARNING: This key is not certified with a trusted signature!
+gpg:          There is no indication that the signature belongs to the owner.
+Primary key fingerprint: 1271 7556 040E EF2E EAF1  B9C2 75FC CD0A 25FA 0E4B
+```
+
+#### Verify the SHA512 sum
+
+Run this:
+
+```shell script
+for i in *.sha512
+do
+    echo "Checking $i"; gpg --print-md SHA512 `basename $i .sha512 ` | diff - 
$i
+done
+```
+
+You should get output similar to:
+
+```
+Checking apache-airflow-1.10.12rc4-bin.tar.gz.sha512
+Checking apache_airflow-1.10.12rc4-py2.py3-none-any.whl.sha512
+Checking apache-airflow-1.10.12rc4-source.tar.gz.sha512
+```
+
+### Verify by Contributors that the Backport Packages release candidates "work"
+
+This can be done (and we encourage it) by any of the Contributors. In fact, it's best if the
+actual users of Apache Airflow test it in their own staging/test installations. Each release candidate
+is available on PyPI apart from the SVN packages, so everyone should be able to install
+the release candidate version of the packages simply via pip (<VERSION> is 2020.5.20 for
+example, and <X> is the release candidate number: 1, 2, 3, ...).
+
+You have to make sure you have Airflow 1.10.* installed (the version you want to install the providers with).
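+
+For example (pick the 1.10.x version you actually want to test against - 1.10.12 here is just
+an illustration):
+
+```shell script
+pip install "apache-airflow==1.10.12"
+```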
+
+```shell script
+pip install apache-airflow-backport-providers-<provider>==<VERSION>rc<X>
+```
+Optionally it can be followed with constraints:
+
+```shell script
+pip install apache-airflow-backport-providers-<provider>==<VERSION>rc<X> \
+  --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-<VERSION>/constraints-3.6.txt"
+```
+
+Note that the constraints file is specific to the Python version that you are installing with.
+
+You can use any of the installation methods you prefer (you can even install it via the binary wheels
+downloaded from the SVN).
+
+There is also an easy way of installation with Breeze if you have the latest sources of Apache Airflow.
+Here is a typical scenario.
+
+First copy all the provider packages' .whl files to the `dist` folder, for example (the source path below is illustrative):
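+
+```shell script
+# The source path is illustrative - point it at wherever the .whl files were downloaded/built
+cp ~/Downloads/apache_airflow_backport_providers_*.whl ${AIRFLOW_REPO_ROOT}/dist/
+```
+
+Then start Airflow with Breeze: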
+
+```shell script
+./breeze start-airflow --install-airflow-version <VERSION>rc<X> \
+    --python 3.7 --backend postgres --install-wheels
+```
+
+For 1.10 releases you can also use the `--no-rbac-ui` flag to disable the RBAC UI of Airflow:
+
+```shell script
+./breeze start-airflow --install-airflow-version <VERSION>rc<X> \
+    --python 3.7 --backend postgres --install-wheels --no-rbac-ui
+```
+

Review comment:
   I propose to add the example below as some people like to have more control over their environment. This will allow us to better test the new versions in different edge cases and configurations.
   
   > You can also use the official image and PyPI packages to test backport packages. If you need to fully test the integration, sometimes you also have to install additional components. Below is a Dockerfile which installs `gcloud`, `kubectl` and the backport providers for Google and Kubernetes.
   > 
   > ```Dockerfile
   > FROM apache/airflow:1.10.12
   > 
   > RUN BACKPORT_RELEASE=2020.10.5rc1 \
   >     && pip install --user "apache-airflow-backport-providers-google==${BACKPORT_RELEASE}" \
   >     && pip install --user "apache-airflow-backport-providers-cncf-kubernetes==${BACKPORT_RELEASE}"
   > 
   > RUN curl https://sdk.cloud.google.com | bash \
   >     && echo "source /home/airflow/google-cloud-sdk/path.bash.inc" >> /home/airflow/.bashrc \
   >     && echo "source /home/airflow/google-cloud-sdk/completion.bash.inc" >> /home/airflow/.bashrc
   > 
   > USER 0
   > RUN KUBECTL_VERSION="$(curl -s https://storage.googleapis.com/kubernetes-release/release/stable.txt)" \
   >     && KUBECTL_URL="https://storage.googleapis.com/kubernetes-release/release/${KUBECTL_VERSION}/bin/linux/amd64/kubectl" \
   >     && curl -L "${KUBECTL_URL}" --output /usr/local/bin/kubectl \
   >     && chmod +x /usr/local/bin/kubectl
   > 
   > USER ${AIRFLOW_UID}
   > ```
   > Feel free to modify this example to test your use cases.
   > 
   > To build the image and run a shell in the container, run:
   > ```bash
   > docker build . -t my-airflow
   > docker run -ti \
   >    --rm \
   >    -v "$PWD/data:/opt/airflow/" \
   >    -v "$PWD/keys/:/keys/" \
   >    -p 8080:8080 \
   >    -e GOOGLE_APPLICATION_CREDENTIALS=/keys/sa.json \
   >    -e AIRFLOW__CORE__LOAD_EXAMPLES=True \
   >    my-airflow
   > ```
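   > 
   > Once the container is running, a quick smoke test is to check that the providers are importable (a sketch - backport providers install their modules under the `airflow.providers` namespace, and the exact module paths below are illustrative):
   > ```bash
   > python -c "import airflow.providers.google.cloud.operators.bigquery"
   > python -c "import airflow.providers.cncf.kubernetes.operators.kubernetes_pod"
   > ```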

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

