Lee-W commented on code in PR #51980:
URL: https://github.com/apache/airflow/pull/51980#discussion_r2159872216


##########
docs/README.md:
##########
@@ -164,8 +164,8 @@ This workflow also invalidates cache in Fastly that Apache 
Software Foundation u
 so you should always run it after you modify the documentation for the 
website. Other than that Fastly is
 configured with 3600 seconds TTL - which means that changes will propagate to 
the website in ~1 hour.
 
-Shortly after the workflow succeeds and documentation is published, in live 
bucket, the `airflow-site-archive`
-repository is automatically synchronized with the live S3 bucket. TODO: 
IMPLEMENT THIS, FOR NOW IT HAS
+Shortly after the workflow succeeds and documentation is published, in the 
live bucket, the `airflow-site-archive`
+repository is automatically synchronized with the live S3 bucket. TO-DO: 
IMPLEMENT THIS, FOR NOW IT HAS

Review Comment:
   ```suggestion
   repository is automatically synchronized with the live S3 bucket. TODO: 
IMPLEMENT THIS, FOR NOW IT HAS
   ```
   
   TODO is something that can usually be highlighted by the editor. Let's keep it.



##########
docs/README.md:
##########
@@ -193,34 +193,33 @@ The version of sphinx theme is fixed in both repositories:
 * 
https://github.com/apache/airflow-site/blob/main/sphinx_airflow_theme/sphinx_airflow_theme/__init__.py#L21
 * https://github.com/apache/airflow/blob/main/devel-common/pyproject.toml#L77 
in "docs" section
 
-In case of bigger changes to the theme, we
-can first iterate on the website and merge a new theme version, and only after 
that we can switch to the new
+In case of bigger changes to the theme, we can first iterate on the website 
and merge a new theme version, and only after that can we switch to the new

Review Comment:
   we don't want the line to be super long



##########
docs/README.md:
##########
@@ -229,22 +228,22 @@ bad links or when we change some of the structure in the 
documentation. This can
 
 ## Manually publishing documentation directly to S3
 
-The regular publishing workflows involve running Github Actions workflow and 
they cover majority of cases,
-however sometimes some manual updates and cherry-picks are needed, when we 
discover problems with the
-publishing and doc building code - for example when we find that we need to 
fix extensions to sphinx.
+The regular publishing workflows involve running a GitHub Actions workflow, 
and they cover the majority of cases.
+However, sometimes, some manual updates and cherry-picks are needed, when we 
discover problems with the
+publishing and doc building code - for example, when we find that we need to 
fix extensions to sphinx.
 
-In such case, release manager or a committer can build and publish 
documentation locally - providing that
+In such a case, the release manager or a committer can build and publish 
documentation locally - providing that
 they configure AWS credentials to be able to upload files to S3. You can ask in
 the #internal-airflow-ci-cd channel on Airflow Slack to get your AWS 
credentials configured.
 
-You can checkout locally a version of airflow repo that you need and apply any 
cherry-picks you need before
+You can check out locally a version of airflow repo that you need and apply 
any cherry-picks you need before

Review Comment:
   ```suggestion
   You can checkout locally a version of airflow repo that you need and apply 
any cherry-picks you need before
   ```
   
   I think it stands for `git checkout`. It's better to keep it this way
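
   Since the comment above preserves the `git checkout` reading, the checkout-and-cherry-pick flow the README describes can be sketched with plain git. This is a minimal, self-contained sketch using a throwaway repository; the tag, file, and branch names are placeholders, not real Airflow refs:

   ```shell
   set -eu
   # Minimal sketch of "check out a version, then apply cherry-picks".
   # Everything here is a placeholder: a throwaway repo stands in for
   # apache/airflow, and "docs-v1" stands in for a released version tag.
   repo=$(mktemp -d)
   cd "$repo"
   git init -q
   git config user.email "[email protected]"
   git config user.name "Demo"
   echo "v1 docs" > README.md
   git add README.md && git commit -qm "initial docs"
   git tag docs-v1                      # stands in for the released tag
   echo "fixed link" >> README.md
   git add README.md && git commit -qm "fix: broken link"
   fix_sha=$(git rev-parse HEAD)
   # Check out the released tag on a work branch, then cherry-pick the fix:
   git checkout -q -b publish-fixes docs-v1
   git cherry-pick "$fix_sha" >/dev/null
   grep "fixed link" README.md
   ```

   In the real flow the checkout targets an actual Airflow version tag, and the docs are then built and published from that working tree.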



##########
docs/README.md:
##########
@@ -38,44 +38,44 @@
 # Documentation configuration
 
 This directory used to contain all the documentation files for the project. 
The documentation has
-been split to separate folders - the documentation is now in the folders in 
sub-projects that they
+been split into separate folders - the documentation is now in the folders in 
sub-projects that they
 are referring to.
 
-If you look for the documentation it is stored as follows:
+If you look for the documentation, it is stored as follows:
 
 Documentation in separate distributions:
 
 * `airflow-core/docs` - documentation for Airflow Core
 * `providers/**/docs` - documentation for Providers
-* `chart/docs` - documentation for Helm Chart
+* `chart/docs` - documentation for the Helm Chart
 * `task-sdk/docs` - documentation for Task SDK (new format not yet published)
 * `airflow-ctl/docs` - documentation for Airflow CLI (future)
 
-Documentation for general overview and summaries not connected with any 
specific distribution:
+Documentation for a general overview and summaries not connected with any 
specific distribution:
 
 * `docker-stack-docs` - documentation for Docker Stack'
-* `providers-summary-docs` - documentation for provider summary page
+* `providers-summary-docs` - documentation for the provider summary page
 
 # Architecture of documentation for Airflow
 
-Building documentation for Airflow is optimized for speed and for convenience 
workflows of the release
-managers and committers who publish and fix the documentation - that's why 
it's a little complex, as we have
-multiple repositories and multiple sources of the documentation involved.
+Building documentation for Airflow is optimized for speed and for the 
convenience workflows of the release
+managers and committers who publish and fix the documentation - that's why 
it's a little complex, as we have 
+multiple repositories and multiple sources of documentation involved.
 
-There are few repositories under `apache` organization which are used to build 
the documentation for Airflow:
+There are a few repositories under `apache` organization that are used to 
build the documentation for Airflow:
 
 * `apache-airflow` - the repository with the code and the documentation 
sources for Airflow distributions,
-   provider distributions, providers summary and docker summary: 
[apache-airflow](https://github.com/apache/airflow)
-   from here we publish the documentation to S3 bucket where the documentation 
is hosted.
+   provider distributions, provider's summary, and docker summary: 
[apache-airflow](https://github.com/apache/airflow).

Review Comment:
   ```suggestion
      provider distributions, providers' summary, and docker summary: 
[apache-airflow](https://github.com/apache/airflow).
   ```



##########
docs/README.md:
##########
@@ -38,44 +38,44 @@
 # Documentation configuration
 
 This directory used to contain all the documentation files for the project. 
The documentation has
-been split to separate folders - the documentation is now in the folders in 
sub-projects that they
+been split into separate folders - the documentation is now in the folders in 
sub-projects that they
 are referring to.
 
-If you look for the documentation it is stored as follows:
+If you look for the documentation, it is stored as follows:
 
 Documentation in separate distributions:
 
 * `airflow-core/docs` - documentation for Airflow Core
 * `providers/**/docs` - documentation for Providers
-* `chart/docs` - documentation for Helm Chart
+* `chart/docs` - documentation for the Helm Chart
 * `task-sdk/docs` - documentation for Task SDK (new format not yet published)
 * `airflow-ctl/docs` - documentation for Airflow CLI (future)
 
-Documentation for general overview and summaries not connected with any 
specific distribution:
+Documentation for a general overview and summaries not connected with any 
specific distribution:
 
 * `docker-stack-docs` - documentation for Docker Stack'
-* `providers-summary-docs` - documentation for provider summary page
+* `providers-summary-docs` - documentation for the provider summary page
 
 # Architecture of documentation for Airflow
 
-Building documentation for Airflow is optimized for speed and for convenience 
workflows of the release
-managers and committers who publish and fix the documentation - that's why 
it's a little complex, as we have
-multiple repositories and multiple sources of the documentation involved.
+Building documentation for Airflow is optimized for speed and for the 
convenience workflows of the release
+managers and committers who publish and fix the documentation - that's why 
it's a little complex, as we have 
+multiple repositories and multiple sources of documentation involved.
 
-There are few repositories under `apache` organization which are used to build 
the documentation for Airflow:
+There are a few repositories under `apache` organization that are used to 
build the documentation for Airflow:
 
 * `apache-airflow` - the repository with the code and the documentation 
sources for Airflow distributions,
-   provider distributions, providers summary and docker summary: 
[apache-airflow](https://github.com/apache/airflow)
-   from here we publish the documentation to S3 bucket where the documentation 
is hosted.
+   provider distributions, provider's summary, and docker summary: 
[apache-airflow](https://github.com/apache/airflow).
+   From here, we publish the documentation to an S3 bucket where the 
documentation is hosted.
 * `airflow-site` - the repository with the website theme and content where we 
keep sources of the website
-   structure, navigation, theme for the website 
[airflow-site](https://github.com/apache/airflow). From here
-   we publish the website to the ASF servers so they are publish as the 
[official website](https://airflow.apache.org)
+   structure, navigation, and theme for the website 
[airflow-site](https://github.com/apache/airflow). From here,

Review Comment:
   ```suggestion
      structure, navigation, and theme for the website 
[airflow-site](https://github.com/apache/airflow-site). From here,
   ```
   
   hmmm... just noticed the link for it is wrong 🤔



##########
docs/README.md:
##########
@@ -164,8 +164,8 @@ This workflow also invalidates cache in Fastly that Apache 
Software Foundation u
 so you should always run it after you modify the documentation for the 
website. Other than that Fastly is
 configured with 3600 seconds TTL - which means that changes will propagate to 
the website in ~1 hour.
 
-Shortly after the workflow succeeds and documentation is published, in live 
bucket, the `airflow-site-archive`
-repository is automatically synchronized with the live S3 bucket. TODO: 
IMPLEMENT THIS, FOR NOW IT HAS
+Shortly after the workflow succeeds and documentation is published, in the 
live bucket, the `airflow-site-archive`

Review Comment:
   ```suggestion
   Shortly after the workflow succeeds and documentation is published, in the 
live bucket, the 
[airflow-site-archive](https://github.com/apache/airflow-site-archive)
   ```



##########
docs/README.md:
##########
@@ -193,34 +193,33 @@ The version of sphinx theme is fixed in both repositories:
 * 
https://github.com/apache/airflow-site/blob/main/sphinx_airflow_theme/sphinx_airflow_theme/__init__.py#L21
 * https://github.com/apache/airflow/blob/main/devel-common/pyproject.toml#L77 
in "docs" section
 
-In case of bigger changes to the theme, we
-can first iterate on the website and merge a new theme version, and only after 
that we can switch to the new
+In case of bigger changes to the theme, we can first iterate on the website 
and merge a new theme version, and only after that can we switch to the new
 version of the theme.
 
 
 # Fixing historical documentation
 
-Sometimes we need to update historical documentation (modify generated `html`) 
- for example when we find
+Sometimes we need to update historical documentation (modify generated `html`) 
- for example, when we find
 bad links or when we change some of the structure in the documentation. This 
can be done via the
 `airflow-site-archive` repository. The workflow is as follows:
 
 1. Get the latest version of the documentation from S3 to 
`airflow-site-archive` repository using
-   `Sync s3 to GitHub` workflow. This will download the latest version of the 
documentation from S3 to
-   `airflow-site-archive` repository (this should be normally not needed, if 
automated synchronization works).
+   `Sync S3 to GitHub` workflow. This will download the latest version of the 
documentation from S3 to
+   `airflow-site-archive` repository (this should normally be not needed, if 
automated synchronization works).
 2. Make the changes to the documentation in `airflow-site-archive` repository. 
This can be done using any
-   text editors, scripts etc. Those files are generated as `html` files and 
are not meant to be regenerated,
+   text editor, scripts, etc. Those files are generated as `html` files and 
are not meant to be regenerated,

Review Comment:
   ```suggestion
      text editor, script, etc. Those files are generated as `html` files and 
are not meant to be regenerated,
   ```



##########
docs/README.md:
##########
@@ -193,34 +193,33 @@ The version of sphinx theme is fixed in both repositories:
 * 
https://github.com/apache/airflow-site/blob/main/sphinx_airflow_theme/sphinx_airflow_theme/__init__.py#L21
 * https://github.com/apache/airflow/blob/main/devel-common/pyproject.toml#L77 
in "docs" section
 
-In case of bigger changes to the theme, we
-can first iterate on the website and merge a new theme version, and only after 
that we can switch to the new
+In case of bigger changes to the theme, we can first iterate on the website 
and merge a new theme version, and only after that can we switch to the new

Review Comment:
   ```suggestion
   In case of bigger changes to the theme, we can first iterate on the website 
and merge a new theme version,
   and only after that can we switch to the new
   ```



##########
docs/README.md:
##########
@@ -258,18 +257,18 @@ breeze release-management publish-docs-to-s3 
--source-dir-path /tmp/airflow-site
 
 ## Manually publishing documentation via `apache-airflow-site-archive` repo
 
-If you do not have S3 credentials and want to be careful about publishing the 
documentation you can also
+If you do not have S3 credentials and want to be careful about publishing the 
documentation, you can also
 use publishing via `apache-airflow-site-archive` repository. This is a little 
more complex, but it allows
 you to publish documentation without having S3 credentials.
 
 The process is as follows:
 
 1. Run `Sync s3 to GitHub` workflow in `apache-airflow-site-archive` 
repository. This will download the
-   latest version of the documentation from S3 to `airflow-site-archive` 
repository (this should be normally
+   latest version of the documentation from S3 to `airflow-site-archive` 
repository (this should normally be
    not needed, if automated synchronization works).
-2. Checkout `apache-airflow-site-archive` repository and create a branch for 
your changes.
-3. Build documentation locally in `apache-airflow` repo with any cherry-picks 
and modifications you need and
-   publish the docs to the checked out `airflow-site-archive` branch
+2. Check out the `apache-airflow-site-archive` repository and create a branch 
for your changes.

Review Comment:
   Same here
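
   The branch-based flow in steps 2-3 above (check out `apache-airflow-site-archive`, create a branch, drop regenerated `html` onto it) can be sketched locally. The throwaway repo and directory layout below are illustrative placeholders, not the real archive structure:

   ```shell
   set -eu
   # Local stand-in for steps 2-3 of the archive flow: branch the archive
   # checkout and commit regenerated html onto it. The paths are
   # illustrative only, not the real airflow-site-archive layout.
   archive=$(mktemp -d)
   cd "$archive"
   git init -q
   git config user.email "[email protected]"
   git config user.name "Demo"
   mkdir -p docs-archive/apache-airflow/stable
   echo "<html>old build</html>" > docs-archive/apache-airflow/stable/index.html
   git add -A && git commit -qm "baseline archive"
   # Step 2: create a branch for your changes
   git checkout -q -b publish-docs-fix
   # Step 3 stand-in: overwrite the archived html with a fresh local build
   echo "<html>fixed build</html>" > docs-archive/apache-airflow/stable/index.html
   git add -A && git commit -qm "republish fixed docs"
   git show --stat --oneline HEAD
   ```

   In the real flow the fresh `html` comes from a docs build in the `apache-airflow` checkout, and the branch is then pushed to the archive repository for publishing.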



##########
docs/README.md:
##########
@@ -193,34 +193,33 @@ The version of sphinx theme is fixed in both repositories:
 * 
https://github.com/apache/airflow-site/blob/main/sphinx_airflow_theme/sphinx_airflow_theme/__init__.py#L21
 * https://github.com/apache/airflow/blob/main/devel-common/pyproject.toml#L77 
in "docs" section
 
-In case of bigger changes to the theme, we
-can first iterate on the website and merge a new theme version, and only after 
that we can switch to the new
+In case of bigger changes to the theme, we can first iterate on the website 
and merge a new theme version, and only after that can we switch to the new
 version of the theme.
 
 
 # Fixing historical documentation
 
-Sometimes we need to update historical documentation (modify generated `html`) 
- for example when we find
+Sometimes we need to update historical documentation (modify generated `html`) 
- for example, when we find
 bad links or when we change some of the structure in the documentation. This 
can be done via the
 `airflow-site-archive` repository. The workflow is as follows:
 
 1. Get the latest version of the documentation from S3 to 
`airflow-site-archive` repository using
-   `Sync s3 to GitHub` workflow. This will download the latest version of the 
documentation from S3 to
-   `airflow-site-archive` repository (this should be normally not needed, if 
automated synchronization works).
+   `Sync S3 to GitHub` workflow. This will download the latest version of the 
documentation from S3 to

Review Comment:
   Good catch, it's actually named `Sync S3 to GitHub` here: 
https://github.com/apache/airflow-site-archive/blob/main/.github/workflows/s3-to-github.yml



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
