This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
     new b7a415e2233 Update release notes instructions after 2025-11-14 release (#58410)
b7a415e2233 is described below

commit b7a415e2233f3844dde426ac72767ab5eb73244a
Author: Jarek Potiuk <[email protected]>
AuthorDate: Mon Nov 17 23:30:56 2025 +0100

    Update release notes instructions after 2025-11-14 release (#58410)
---
 .github/workflows/publish-docs-to-s3.yml           |  2 +-
 dev/README_RELEASE_PROVIDERS.md                    | 51 ++++++++++++++--------
 .../src/airflow_breeze/utils/publish_docs_to_s3.py |  2 +-
 dev/breeze/tests/test_publish_docs_to_s3.py        |  6 +--
 4 files changed, 39 insertions(+), 22 deletions(-)

diff --git a/.github/workflows/publish-docs-to-s3.yml b/.github/workflows/publish-docs-to-s3.yml
index 9ca30c924e8..1a9a399b81c 100644
--- a/.github/workflows/publish-docs-to-s3.yml
+++ b/.github/workflows/publish-docs-to-s3.yml
@@ -372,4 +372,4 @@ jobs:
         run: |
          breeze release-management publish-docs-to-s3 --source-dir-path ${SOURCE_DIR_PATH} \
          --destination-location ${DESTINATION_LOCATION} --stable-versions \
-          --exclude-docs ${EXCLUDE_DOCS} --overwrite ${SKIP_WRITE_TO_STABLE_FOLDER}
+          --exclude-docs "${EXCLUDE_DOCS}" --overwrite ${SKIP_WRITE_TO_STABLE_FOLDER}
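
The quoting fix above matters once `EXCLUDE_DOCS` holds a space-separated list: unquoted, the shell word-splits the value into several arguments, so the CLI would receive only the first item as the option value. A minimal bash sketch of the difference (not part of the commit; `count_args` is a hypothetical helper):

```shell
# Demonstrate word splitting of an unquoted vs. quoted variable expansion.
EXCLUDE_DOCS="amazon google"

count_args() { echo "$#"; }

unquoted=$(count_args ${EXCLUDE_DOCS})    # word-split into two arguments
quoted=$(count_args "${EXCLUDE_DOCS}")    # preserved as a single argument

echo "unquoted=${unquoted} quoted=${quoted}"   # prints: unquoted=2 quoted=1
```

With the quotes, `--exclude-docs` receives the whole space-separated list as one value, which is what the new space-separated parsing below expects.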
diff --git a/dev/README_RELEASE_PROVIDERS.md b/dev/README_RELEASE_PROVIDERS.md
index f04e8e80573..70a14233f76 100644
--- a/dev/README_RELEASE_PROVIDERS.md
+++ b/dev/README_RELEASE_PROVIDERS.md
@@ -1211,6 +1211,12 @@ example `git checkout providers/2025-10-31`
 Note you probably will see message `You are in 'detached HEAD' state.`
 This is expected, the RC tag is most likely behind the main branch.
 
+* Remove source artifact:
+
+```shell script
+rm dist/apache_airflow_providers-${RELEASE_DATE}-source.tar.gz
+```
+
 * Verify the artifacts that would be uploaded:
 
 ```shell script
@@ -1244,13 +1250,6 @@ If you want to disable this behaviour, set the env **CLEAN_LOCAL_TAGS** to false
 breeze release-management tag-providers
 ```
 
-The command should output all the tags it created. At the end it should also print the general tag
-applied for this provider's release wave with the date of release preparation in the format of:
-
-```
-providers/2025-11-03
-```
-
 ## Publish documentation
 
 Documentation is an essential part of the product and should be made available to users.
@@ -1267,15 +1266,31 @@ You usually use the `breeze` command to publish the documentation. The command d
 2. Triggers workflow in apache/airflow-site to refresh
 3. Triggers S3 to GitHub Sync
 
+First, unset GITHUB_TOKEN if you have it set; the workflow command reads the token from your
+GitHub configuration when you log in with `gh`. Do this only once - you do not have to repeat it afterwards.
+
+```shell script
+unset GITHUB_TOKEN
+brew install gh
+gh auth login
+```
+
+Run workflows:
+
 ```shell script
-  unset GITHUB_TOKEN
   breeze workflow-run publish-docs --ref providers/${RELEASE_DATE} --site-env live all-providers
 ```
 
+If you need to exclude some providers from the documentation, add the `--exclude-docs` flag
+with a space-separated list of excluded providers.
+
+```shell script
+  breeze workflow-run publish-docs --ref providers/${RELEASE_DATE} --site-env live all-providers --exclude-docs "apprise slack"
+```
+
 Or if you just want to publish a few selected providers, you can run:
 
 ```shell script
-  unset GITHUB_TOKEN
   breeze workflow-run publish-docs --ref providers/${RELEASE_DATE} --site-env live PACKAGE1 PACKAGE2 ..
 ```
 
@@ -1289,6 +1304,8 @@ not be needed unless there is some problem with workflow automation above)
 
 ## Update providers metadata
 
+Create a PR and open it to be merged:
+
 ```shell script
 cd ${AIRFLOW_REPO_ROOT}
 git checkout main
@@ -1299,11 +1316,9 @@ git checkout -b "${branch}"
 breeze release-management generate-providers-metadata --refresh-constraints-and-airflow-releases
 git add -p .
 git commit -m "Update providers metadata ${current_date}"
-git push --set-upstream origin "${branch}"
+gh pr create --title "Update providers metadata ${current_date}" --web
 ```
 
-Create PR and get it merged
-
 ## Notify developers of release
 
 Notify [email protected] (cc'ing [email protected]) that
@@ -1324,15 +1339,16 @@ cat <<EOF
 Dear Airflow community,
 
 I'm happy to announce that new versions of Airflow Providers packages prepared on ${RELEASE_DATE} were just released.
+
 Full list of PyPI packages released is added at the end of the message.
 
 The source release, as well as the binary releases, are available here:
 
-https://airflow.apache.org/docs/apache-airflow-providers/installing-from-sources
+https://airflow.apache.org/docs/apache-airflow-providers/installing-from-sources.html
 
-You can install the providers via PyPI: https://airflow.apache.org/docs/apache-airflow-providers/installing-from-pypi
+You can install the providers via PyPI: https://airflow.apache.org/docs/apache-airflow-providers/installing-from-pypi.html
 
-The documentation is available at https://airflow.apache.org/docs/ and linked from the PyPI packages.
+The documentation index is available at https://airflow.apache.org/docs/ and documentation for individual provider versions is linked directly from PyPI.
 
 ----
 
@@ -1384,7 +1400,6 @@ Example for special cases:
 
------------------------------------------------------------------------------------------------------------
 Announcement is done from official Apache-Airflow accounts.
 
-* X: https://x.com/ApacheAirflow
 * LinkedIn: https://www.linkedin.com/company/apache-airflow/
 * Fosstodon: https://fosstodon.org/@airflow
 * Bluesky: https://bsky.app/profile/apache-airflow.bsky.social
@@ -1396,7 +1411,9 @@ If you don't have access to the account ask a PMC member to post.
 
 ## Add release data to Apache Committee Report Helper
 
-Add the release data (version and date) at: https://reporter.apache.org/addrelease.html?airflow
+You should receive an email urging you to add the release data but, in case you don't,
+you can add it manually: add the release data (version and date) at:
+https://reporter.apache.org/addrelease.html?airflow
 
 ## Close the testing status issue
 
diff --git a/dev/breeze/src/airflow_breeze/utils/publish_docs_to_s3.py b/dev/breeze/src/airflow_breeze/utils/publish_docs_to_s3.py
index 07e3e760d81..c7b396bee78 100644
--- a/dev/breeze/src/airflow_breeze/utils/publish_docs_to_s3.py
+++ b/dev/breeze/src/airflow_breeze/utils/publish_docs_to_s3.py
@@ -79,7 +79,7 @@ class S3DocsPublish:
     def get_all_excluded_docs(self):
         if not self.exclude_docs:
             return []
-        excluded_docs = self.exclude_docs.split(",")
+        excluded_docs = self.exclude_docs.split(" ")
 
         # We remove `no-docs-excluded` string, this will be send from github workflows input as default value.
         if "no-docs-excluded" in excluded_docs:
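
The hunk above switches `exclude_docs` parsing from comma-separated to space-separated, matching the quoted space-separated `EXCLUDE_DOCS` the workflow now passes. A small standalone bash sketch (not the Breeze code itself) of the resulting behaviour, including the `no-docs-excluded` placeholder that `get_all_excluded_docs` drops:

```shell
# Sketch only: emulate the new space-separated parsing of an exclude list,
# dropping the "no-docs-excluded" placeholder sent by the GitHub workflow input.
EXCLUDE_DOCS="amazon no-docs-excluded apache-airflow"

items=()
for doc in ${EXCLUDE_DOCS}; do     # unquoted on purpose: split on spaces
  if [ "${doc}" = "no-docs-excluded" ]; then
    continue                       # the default value excludes nothing
  fi
  items+=("${doc}")
done

echo "${items[@]}"   # prints: amazon apache-airflow
```

The tests below encode the same contract on the Python side: a space-separated string splits into individual package names.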
diff --git a/dev/breeze/tests/test_publish_docs_to_s3.py b/dev/breeze/tests/test_publish_docs_to_s3.py
index 0e9fad9a660..f0d419a43e9 100644
--- a/dev/breeze/tests/test_publish_docs_to_s3.py
+++ b/dev/breeze/tests/test_publish_docs_to_s3.py
@@ -46,7 +46,7 @@ class TestPublishDocsToS3:
                 self.publish_docs_to_s3.get_all_docs()
 
     def test_get_all_excluded_docs(self):
-        self.publish_docs_to_s3.exclude_docs = "amazon,google,apache-airflow"
+        self.publish_docs_to_s3.exclude_docs = "amazon google apache-airflow"
         assert self.publish_docs_to_s3.get_all_excluded_docs == ["amazon", "google", "apache-airflow"]
 
     @patch("os.listdir")
@@ -62,7 +62,7 @@ class TestPublishDocsToS3:
             "apache-airflow-ctl",
         ]
 
-        self.publish_docs_to_s3.exclude_docs = "amazon,docker-stack,apache.kafka"
+        self.publish_docs_to_s3.exclude_docs = "amazon docker-stack apache.kafka"
 
         assert sorted(self.publish_docs_to_s3.get_all_eligible_docs) == sorted(
             [
@@ -81,7 +81,7 @@ class TestPublishDocsToS3:
             "apache-airflow",
             "apache-airflow-providers-apache-kafka",
         ]
-        self.publish_docs_to_s3.exclude_docs = "amazon,apache-airflow,apache.kafka"
+        self.publish_docs_to_s3.exclude_docs = "amazon apache-airflow apache.kafka"
 
         with pytest.raises(SystemExit):
             self.publish_docs_to_s3.get_all_eligible_docs
