This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git

commit bcc76656844be47f923527c0a6cd1de546655cb4
Author: Jarek Potiuk <ja...@potiuk.com>
AuthorDate: Sat Aug 21 23:10:31 2021 +0200

    Improve discoverability of Provider packages' functionality
    
    The documentation of provider packages was rather disconnected
    from the apache-airflow documentation. It was hard to find how
    Apache Airflow's core extensions are implemented by the
    community-managed providers - you needed to know what you were
    looking for, and you could not find links to the summary of the
    core functionality extended by providers when you were looking at
    the functionality (like logging/secret backends/connections/auth).
    
    This PR introduces much more comprehensive cross-linking between
    the Airflow core functionality and the community-managed providers
    that provide extensions to the core functionality.
---
 .../logging/cloud-watch-task-handlers.rst          |   2 +-
 .../logging/s3-task-handler.rst                    |   2 +-
 .../index.rst                                      |   2 +-
 .../{logging.rst => logging/index.rst}             |   6 +-
 .../redirects.txt                                  |   1 +
 .../logging/gcs.rst                                |   2 +-
 .../logging/index.rst                              |   4 +-
 .../logging/stackdriver.rst                        |   2 +-
 .../index.rst                                      |   2 +-
 .../{logging.rst => logging/index.rst}             |   2 +-
 .../redirects.txt                                  |   1 +
 .../core-extensions/auth-backends.rst}             |  19 +-
 .../core-extensions/connections.rst}               |  22 +-
 .../core-extensions/extra-links.rst}               |  21 +-
 .../core-extensions}/index.rst                     |  10 +-
 .../core-extensions/logging.rst}                   |  15 +-
 .../core-extensions/secrets-backends.rst           |  36 +++
 docs/apache-airflow-providers/index.rst            | 222 ++++++++++++-------
 docs/apache-airflow/concepts/connections.rst       |  10 +
 docs/apache-airflow/concepts/operators.rst         |   9 +-
 docs/apache-airflow/howto/define_extra_link.rst    |   7 +-
 .../logging-monitoring/logging-tasks.rst           |  11 +-
 docs/apache-airflow/operators-and-hooks-ref.rst    |   7 +-
 .../security/secrets/secrets-backend/index.rst     |  23 +-
 docs/build_docs.py                                 |  75 +++++--
 docs/exts/auth_backend.rst.jinja2                  |  27 +++
 docs/exts/connections.rst.jinja2                   |  27 +++
 docs/exts/extra_links.rst.jinja2                   |  27 +++
 docs/exts/logging.rst.jinja2                       |  29 +++
 docs/exts/operators_and_hooks_ref.py               | 246 ++++++++++++++++++---
 docs/exts/secret_backend.rst.jinja2                |  27 +++
 docs/helm-chart/manage-logs.rst                    |   2 +-
 setup.py                                           |   1 +
 33 files changed, 719 insertions(+), 180 deletions(-)

diff --git 
a/docs/apache-airflow-providers-amazon/logging/cloud-watch-task-handlers.rst 
b/docs/apache-airflow-providers-amazon/logging/cloud-watch-task-handlers.rst
index 4d431c7..c576d78 100644
--- a/docs/apache-airflow-providers-amazon/logging/cloud-watch-task-handlers.rst
+++ b/docs/apache-airflow-providers-amazon/logging/cloud-watch-task-handlers.rst
@@ -17,7 +17,7 @@
 
 .. _write-logs-amazon-cloudwatch:
 
-Writing Logs to Amazon Cloudwatch
+Writing logs to Amazon Cloudwatch
 ---------------------------------
 
 Remote logging to Amazon Cloudwatch uses an existing Airflow connection to 
read or write logs. If you
diff --git a/docs/apache-airflow-providers-amazon/logging/s3-task-handler.rst 
b/docs/apache-airflow-providers-amazon/logging/s3-task-handler.rst
index e37f622..bc12088 100644
--- a/docs/apache-airflow-providers-amazon/logging/s3-task-handler.rst
+++ b/docs/apache-airflow-providers-amazon/logging/s3-task-handler.rst
@@ -17,7 +17,7 @@
 
 .. _write-logs-amazon-s3:
 
-Writing Logs to Amazon S3
+Writing logs to Amazon S3
 -------------------------
 
 Remote logging to Amazon S3 uses an existing Airflow connection to read or 
write logs. If you
diff --git a/docs/apache-airflow-providers-elasticsearch/index.rst 
b/docs/apache-airflow-providers-elasticsearch/index.rst
index a5f9799..60d500e 100644
--- a/docs/apache-airflow-providers-elasticsearch/index.rst
+++ b/docs/apache-airflow-providers-elasticsearch/index.rst
@@ -27,7 +27,7 @@ Content
     :caption: Guides
 
     Connection types <connections/elasticsearch>
-    Logging for Tasks <logging>
+    Logging for Tasks <logging/index>
 
 .. toctree::
     :maxdepth: 1
diff --git a/docs/apache-airflow-providers-elasticsearch/logging.rst 
b/docs/apache-airflow-providers-elasticsearch/logging/index.rst
similarity index 98%
rename from docs/apache-airflow-providers-elasticsearch/logging.rst
rename to docs/apache-airflow-providers-elasticsearch/logging/index.rst
index 4ba22bd..e558db5 100644
--- a/docs/apache-airflow-providers-elasticsearch/logging.rst
+++ b/docs/apache-airflow-providers-elasticsearch/logging/index.rst
@@ -17,7 +17,7 @@
 
 .. _write-logs-elasticsearch:
 
-Writing Logs to Elasticsearch
+Writing logs to Elasticsearch
 -----------------------------
 
 Airflow can be configured to read task logs from Elasticsearch and optionally 
write logs to stdout in standard or json format. These logs can later be 
collected and forwarded to the Elasticsearch cluster using tools like fluentd, 
logstash or others.
@@ -64,7 +64,7 @@ To output task logs to stdout in JSON format, the following 
config could be used
 
 .. _write-logs-elasticsearch-tls:
 
-Writing Logs to Elasticsearch over TLS
+Writing logs to Elasticsearch over TLS
 ''''''''''''''''''''''''''''''''''''''
 
 To add custom configurations to ElasticSearch (e.g. turning on ``ssl_verify``, 
adding a custom self-signed
@@ -91,7 +91,7 @@ Elasticsearch External Link
 
 A user can configure Airflow to show a link to an Elasticsearch log viewing 
system (e.g. Kibana).
 
-To enable it, ``airflow.cfg`` must be configured as in the example below. Note 
the required ``{log_id}`` in the URL, when constructing the external link, 
Airflow replaces this parameter with the same ``log_id_template`` used for 
writing logs (see `Writing Logs to Elasticsearch`_).
+To enable it, ``airflow.cfg`` must be configured as in the example below.
+Note the required ``{log_id}`` in the URL; when constructing the external
+link, Airflow replaces this parameter with the same ``log_id_template`` used
+for writing logs (see `Writing logs to Elasticsearch`_).
 
 .. code-block:: ini
 
diff --git a/docs/apache-airflow-providers-elasticsearch/redirects.txt 
b/docs/apache-airflow-providers-elasticsearch/redirects.txt
new file mode 100644
index 0000000..2902a9c
--- /dev/null
+++ b/docs/apache-airflow-providers-elasticsearch/redirects.txt
@@ -0,0 +1 @@
+logging.rst logging/index.rst
diff --git a/docs/apache-airflow-providers-google/logging/gcs.rst 
b/docs/apache-airflow-providers-google/logging/gcs.rst
index 4e6083a..328f41a 100644
--- a/docs/apache-airflow-providers-google/logging/gcs.rst
+++ b/docs/apache-airflow-providers-google/logging/gcs.rst
@@ -17,7 +17,7 @@
 
 .. _write-logs-gcp:
 
-Writing Logs to Google Cloud Storage
+Writing logs to Google Cloud Storage
 ------------------------------------
 
 Remote logging to Google Cloud Storage uses an existing Airflow connection to 
read or write logs. If you
diff --git a/docs/apache-airflow-providers-google/logging/index.rst 
b/docs/apache-airflow-providers-google/logging/index.rst
index 6f54081..9bade06 100644
--- a/docs/apache-airflow-providers-google/logging/index.rst
+++ b/docs/apache-airflow-providers-google/logging/index.rst
@@ -15,8 +15,8 @@
     specific language governing permissions and limitations
     under the License.
 
-Task handlers
--------------
+Writing logs to Google Cloud Platform
+-------------------------------------
 
 .. toctree::
     :maxdepth: 1
diff --git a/docs/apache-airflow-providers-google/logging/stackdriver.rst 
b/docs/apache-airflow-providers-google/logging/stackdriver.rst
index d074934..901c35f 100644
--- a/docs/apache-airflow-providers-google/logging/stackdriver.rst
+++ b/docs/apache-airflow-providers-google/logging/stackdriver.rst
@@ -17,7 +17,7 @@
 
 .. _write-logs-stackdriver:
 
-Writing Logs to Google Stackdriver
+Writing logs to Google Stackdriver
 ----------------------------------
 
 Airflow can be configured to read and write task logs in `Google Stackdriver 
Logging <https://cloud.google.com/logging/>`__.
diff --git a/docs/apache-airflow-providers-microsoft-azure/index.rst 
b/docs/apache-airflow-providers-microsoft-azure/index.rst
index 7215778..f5b73ed 100644
--- a/docs/apache-airflow-providers-microsoft-azure/index.rst
+++ b/docs/apache-airflow-providers-microsoft-azure/index.rst
@@ -29,7 +29,7 @@ Content
     Connection types <connections/index>
     Operators <operators/index>
     Secrets backends <secrets-backends/azure-key-vault>
-    Logging for Tasks <logging>
+    Logging for Tasks <logging/index>
 
 .. toctree::
     :maxdepth: 1
diff --git a/docs/apache-airflow-providers-microsoft-azure/logging.rst 
b/docs/apache-airflow-providers-microsoft-azure/logging/index.rst
similarity index 98%
rename from docs/apache-airflow-providers-microsoft-azure/logging.rst
rename to docs/apache-airflow-providers-microsoft-azure/logging/index.rst
index 3a87eba..766dc69 100644
--- a/docs/apache-airflow-providers-microsoft-azure/logging.rst
+++ b/docs/apache-airflow-providers-microsoft-azure/logging/index.rst
@@ -17,7 +17,7 @@
 
 .. _write-logs-azure:
 
-Writing Logs to Azure Blob Storage
+Writing logs to Azure Blob Storage
 ----------------------------------
 
 Airflow can be configured to read and write task logs in Azure Blob Storage. 
It uses an existing
diff --git a/docs/apache-airflow-providers-microsoft-azure/redirects.txt 
b/docs/apache-airflow-providers-microsoft-azure/redirects.txt
index 067fea0..3ce1176 100644
--- a/docs/apache-airflow-providers-microsoft-azure/redirects.txt
+++ b/docs/apache-airflow-providers-microsoft-azure/redirects.txt
@@ -1,2 +1,3 @@
 connections/index.rst connections/azure.rst
 secrets-backends/index.rst secrets-backends/azure-key-vault-secrets-backend.rst
+logging.rst logging/index.rst
diff --git a/docs/apache-airflow-providers-google/logging/index.rst 
b/docs/apache-airflow-providers/core-extensions/auth-backends.rst
similarity index 56%
copy from docs/apache-airflow-providers-google/logging/index.rst
copy to docs/apache-airflow-providers/core-extensions/auth-backends.rst
index 6f54081..325b0c2 100644
--- a/docs/apache-airflow-providers-google/logging/index.rst
+++ b/docs/apache-airflow-providers/core-extensions/auth-backends.rst
@@ -15,11 +15,20 @@
     specific language governing permissions and limitations
     under the License.
 
-Task handlers
+Auth backends
 -------------
 
-.. toctree::
-    :maxdepth: 1
-    :glob:
+This is a summary of all Apache Airflow Community provided implementations of 
authentication backends
+exposed via community-managed providers.
 
-    *
+Airflow's authentication for web server and API is based on Flask Application 
Builder's authentication
+capabilities. You can read more about those in
+`FAB security docs 
<https://flask-appbuilder.readthedocs.io/en/latest/security.html>`_.
+
+You can also take a look at the auth backends available in core Airflow in
+:doc:`apache-airflow:security/webserver`, or see those provided by the
+community-managed providers:
+
+.. airflow-auth-backends::
+   :tags: None
+   :header-separator: "
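For context, a minimal sketch of how an auth backend is selected, assuming an Airflow 2.x ``airflow.cfg`` and the basic-auth API backend shipped with core Airflow; provider-supplied backends are configured the same way, by module path:

.. code-block:: ini

    [api]
    # Module path of the auth backend; providers can ship their own backends
    # that are referenced here in exactly the same way.
    auth_backend = airflow.api.auth.backend.basic_auth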
diff --git a/docs/apache-airflow-providers-google/logging/index.rst 
b/docs/apache-airflow-providers/core-extensions/connections.rst
similarity index 51%
copy from docs/apache-airflow-providers-google/logging/index.rst
copy to docs/apache-airflow-providers/core-extensions/connections.rst
index 6f54081..667e0af 100644
--- a/docs/apache-airflow-providers-google/logging/index.rst
+++ b/docs/apache-airflow-providers/core-extensions/connections.rst
@@ -15,11 +15,21 @@
     specific language governing permissions and limitations
     under the License.
 
-Task handlers
--------------
+Connections
+-----------
 
-.. toctree::
-    :maxdepth: 1
-    :glob:
+This is a summary of all Apache Airflow Community provided implementations of 
connections
+exposed via community-managed providers.
 
-    *
+Airflow can be extended by providers with custom connections. Each provider
+can define its own custom connection types, with custom parameters and UI
+customizations/field behaviours applied when a connection is managed via the
+Airflow UI. Those connection types can also be used to automatically create
+Airflow Hooks for specific connections.
+
+The connection management is explained in
+:doc:`apache-airflow:concepts/connections` and you can also see those
+provided by the community-managed providers:
+
+.. airflow-connections::
+   :tags: None
+   :header-separator: "
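To make the custom-connection mechanism concrete, a hedged sketch of the pattern a provider hook can follow; the class, connection type, and field names below are hypothetical, only the ``conn_type``/``get_ui_field_behaviour`` pattern is from the Airflow custom-connection docs:

.. code-block:: python

    from airflow.hooks.base import BaseHook

    class MyServiceHook(BaseHook):
        """Hypothetical hook defining a custom connection type."""

        conn_name_attr = 'my_service_conn_id'
        default_conn_name = 'my_service_default'
        conn_type = 'my_service'  # becomes a selectable type in the UI form
        hook_name = 'My Service'

        @staticmethod
        def get_ui_field_behaviour():
            # Hide unused form fields and relabel the rest for this type.
            return {
                "hidden_fields": ["schema", "extra"],
                "relabeling": {"host": "Service endpoint"},
            }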
diff --git a/docs/apache-airflow-providers-google/logging/index.rst 
b/docs/apache-airflow-providers/core-extensions/extra-links.rst
similarity index 56%
copy from docs/apache-airflow-providers-google/logging/index.rst
copy to docs/apache-airflow-providers/core-extensions/extra-links.rst
index 6f54081..884f590 100644
--- a/docs/apache-airflow-providers-google/logging/index.rst
+++ b/docs/apache-airflow-providers/core-extensions/extra-links.rst
@@ -15,11 +15,20 @@
     specific language governing permissions and limitations
     under the License.
 
-Task handlers
--------------
+Extra Links
+-----------
 
-.. toctree::
-    :maxdepth: 1
-    :glob:
+This is a summary of all Apache Airflow Community provided implementations of 
operator extra links
+exposed via community-managed providers.
 
-    *
+Airflow can be extended by providers with custom operator extra links. Each
+operator can define its own extra links that redirect users to external
+systems. The extra link buttons will be available on the task page.
+
+The operator extra links are explained in
+:doc:`apache-airflow:howto/define_extra_link` and here you can also see the 
extra links
+provided by the community-managed providers:
+
+.. airflow-extra-links::
+   :tags: None
+   :header-separator: "
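As an illustration, a minimal sketch assuming the Airflow 2.x ``BaseOperatorLink`` API; the link class and URL are hypothetical:

.. code-block:: python

    from airflow.models.baseoperator import BaseOperatorLink

    class MyServiceConsoleLink(BaseOperatorLink):
        """Hypothetical extra link rendered as a button on the task page."""

        name = "My Service Console"

        def get_link(self, operator, dttm):
            # Operators opt in via: operator_extra_links = (MyServiceConsoleLink(),)
            return "https://console.example.com/jobs"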
diff --git a/docs/apache-airflow-providers-google/logging/index.rst 
b/docs/apache-airflow-providers/core-extensions/index.rst
similarity index 78%
copy from docs/apache-airflow-providers-google/logging/index.rst
copy to docs/apache-airflow-providers/core-extensions/index.rst
index 6f54081..63cc7f4 100644
--- a/docs/apache-airflow-providers-google/logging/index.rst
+++ b/docs/apache-airflow-providers/core-extensions/index.rst
@@ -1,3 +1,4 @@
+
  .. Licensed to the Apache Software Foundation (ASF) under one
     or more contributor license agreements.  See the NOTICE file
     distributed with this work for additional information
@@ -15,11 +16,14 @@
     specific language governing permissions and limitations
     under the License.
 
-Task handlers
--------------
+Core Extensions
+===============
+
+Here is a list of extensions of the core functionalities of ``Apache
+Airflow``. Providers can extend the core with their own, provider-specific
+implementations of these features.
 
 .. toctree::
-    :maxdepth: 1
+    :maxdepth: 2
     :glob:
 
     *
diff --git a/docs/apache-airflow-providers-google/logging/index.rst 
b/docs/apache-airflow-providers/core-extensions/logging.rst
similarity index 66%
copy from docs/apache-airflow-providers-google/logging/index.rst
copy to docs/apache-airflow-providers/core-extensions/logging.rst
index 6f54081..ca26263 100644
--- a/docs/apache-airflow-providers-google/logging/index.rst
+++ b/docs/apache-airflow-providers/core-extensions/logging.rst
@@ -15,11 +15,14 @@
     specific language governing permissions and limitations
     under the License.
 
-Task handlers
--------------
+Writing logs
+------------
 
-.. toctree::
-    :maxdepth: 1
-    :glob:
+This is a summary of all Apache Airflow Community provided implementations
+of task log writing exposed via community-managed providers. You can also
+see the logging options available in core Airflow in
+:doc:`apache-airflow:logging-monitoring/logging-tasks`, and below you can
+see those provided by the community-managed providers:
 
-    *
+.. airflow-logging::
+   :tags: None
+   :header-separator: "
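For reference, a minimal sketch of turning on remote task logging, assuming the Amazon provider is installed; the bucket path and connection id are placeholders:

.. code-block:: ini

    [logging]
    remote_logging = True
    # Hypothetical bucket; any scheme a provider handler supports can be used.
    remote_base_log_folder = s3://my-bucket/airflow/logs
    remote_log_conn_id = aws_default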
diff --git a/docs/apache-airflow-providers/core-extensions/secrets-backends.rst 
b/docs/apache-airflow-providers/core-extensions/secrets-backends.rst
new file mode 100644
index 0000000..26ee3ce
--- /dev/null
+++ b/docs/apache-airflow-providers/core-extensions/secrets-backends.rst
@@ -0,0 +1,36 @@
+ .. Licensed to the Apache Software Foundation (ASF) under one
+    or more contributor license agreements.  See the NOTICE file
+    distributed with this work for additional information
+    regarding copyright ownership.  The ASF licenses this file
+    to you under the Apache License, Version 2.0 (the
+    "License"); you may not use this file except in compliance
+    with the License.  You may obtain a copy of the License at
+
+ ..   http://www.apache.org/licenses/LICENSE-2.0
+
+ .. Unless required by applicable law or agreed to in writing,
+    software distributed under the License is distributed on an
+    "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+    KIND, either express or implied.  See the License for the
+    specific language governing permissions and limitations
+    under the License.
+
+Secret backends
+---------------
+
+This is a summary of all Apache Airflow Community provided implementations of 
secret backends
+exposed via community-managed providers.
+
+Airflow has the capability of reading connections, variables and
+configuration from Secret Backends rather than from its own database. While
+storing such information in Airflow's database is possible, many enterprise
+users already have secret managers storing their secrets, and Airflow can
+tap into those via providers that implement secrets backends for the
+services Airflow integrates with.
+
+You can also take a look at the secret backends available in core Airflow
+in :doc:`apache-airflow:security/secrets/secrets-backend/index`, and here
+you can see the ones provided by the community-managed providers:
+
+.. airflow-secrets-backends::
+   :tags: None
+   :header-separator: "
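As an illustration, a minimal sketch of enabling a provider-supplied secrets backend, here the Hashicorp Vault backend; the URL and mount path are placeholders:

.. code-block:: ini

    [secrets]
    backend = airflow.providers.hashicorp.secrets.vault.VaultBackend
    # backend_kwargs is a JSON dict passed to the backend's constructor.
    backend_kwargs = {"url": "http://127.0.0.1:8200", "connections_path": "connections"}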
diff --git a/docs/apache-airflow-providers/index.rst 
b/docs/apache-airflow-providers/index.rst
index 99ac4b7..f0c3285 100644
--- a/docs/apache-airflow-providers/index.rst
+++ b/docs/apache-airflow-providers/index.rst
@@ -19,54 +19,91 @@
 Provider packages
 -----------------
 
-.. contents:: :local:
+Apache Airflow 2 is built in a modular way. The "Core" of Apache Airflow
+provides core scheduler functionality which allows you to write basic tasks,
+but the capabilities of Apache Airflow can be extended by installing
+additional packages, called ``providers``.
 
-.. _providers:community-maintained-providers:
-
-Community maintained providers
-''''''''''''''''''''''''''''''
+Providers can contain operators, hooks, sensors, and transfer operators to
+communicate with a multitude of external systems, but they can also extend
+the Airflow core with new capabilities.
 
-Unlike Apache Airflow 1.10, the Airflow 2.0 is delivered in multiple, separate 
but connected packages.
-The core of Airflow scheduling system is delivered as ``apache-airflow`` 
package and there are around
-60 providers packages which can be installed separately as so called "Airflow 
Provider packages".
-Those provider packages are separated per-provider (for example ``amazon``, 
``google``, ``salesforce``
-etc.). Those packages are available as ``apache-airflow-providers`` packages - 
separately per each provider
-(for example there is an ``apache-airflow-providers-amazon`` or 
``apache-airflow-providers-google`` package).
+You can install those provider packages separately in order to interface
+with a given service. The providers for ``Apache Airflow`` are designed so
+that you can write your own providers easily. The ``Apache Airflow
+Community`` develops and maintains more than 60 provider packages, and you
+are free to develop your own - the providers you build have exactly the
+same capabilities as the providers written by the community, so you can
+release and share them with others.
 
 The full list of community managed providers is available at
 `Providers Index 
<https://airflow.apache.org/docs/#providers-packages-docs-apache-airflow-providers-index-html>`_.
 
-You can install those provider packages separately in order to interface with 
a given service. For those
-providers that have corresponding extras, the provider packages (latest 
version from PyPI) are installed
-automatically when Airflow is installed with the extra.
+You can also see an index of all community providers' operators and hooks
+in :doc:`/operators-and-hooks-ref/index`.
 
-Community maintained providers are released and versioned separately from the 
Airflow releases. We are
-following the `Semver <https://semver.org/>`_ versioning scheme for the 
packages. Some versions of the
-provider packages might depend on particular versions of Airflow, but the 
general approach we have is that
-unless there is a good reason, new version of providers should work with 
recent versions of Airflow 2.x.
-Details will vary per-provider and if there is a limitation for particular 
version of particular provider,
-constraining the Airflow version used, it will be included as limitation of 
dependencies in the provider
-package.
+Extending Airflow core functionality
+------------------------------------
 
-Some of the providers have cross-provider dependencies as well. Those are not 
required dependencies, they
-might simply enable certain features (for example transfer operators often 
create dependency between
-different providers. Again, the general approach here is that the providers 
are backwards compatible,
-including cross-dependencies. Any kind of breaking changes and requirements on 
particular versions of other
-provider packages are automatically documented in the release notes of every 
provider.
+Providers give you the capability of extending core Airflow with extra
+functionality. The Airflow core provides basic, solid scheduling
+functionality, and the providers extend its capabilities. Here we describe
+all the custom capabilities.
 
-.. note::
-    We also provide ``apache-airflow-backport-providers`` packages that can be 
installed for Airflow 1.10.
-    Those are the same providers as for 2.0 but automatically back-ported to 
work for Airflow 1.10. The
-    last release of backport providers was done on March 17, 2021.
+Airflow automatically discovers which providers add those additional
+capabilities and, once you install a provider package and restart Airflow,
+they become automatically available to Airflow users.
+
+A summary of the core functionalities that can be extended is available in
+:doc:`/core-extensions/index`.
+
+Auth backends
+'''''''''''''
+
+The providers can add custom authentication backends that allow you to
+configure the way your web server authenticates your users, integrating it
+with public or private authentication services.
+
+You can see all the authentication backends available via community-managed
+providers in :doc:`/core-extensions/auth-backends`.
+
+Custom connections
+''''''''''''''''''
+
+The providers can add custom connection types, extending the connection
+form and handling custom form field behaviour for the connections defined
+by the provider.
 
-Creating and maintaining community providers
-""""""""""""""""""""""""""""""""""""""""""""
+You can see all the custom connection types available via community-managed
+providers in :doc:`/core-extensions/connections`.
 
-See :doc:`howto/create-update-providers` for more information.
+Extra links
+'''''''''''
 
+The providers can add extra custom links to the operators they deliver.
+Those will be visible in the task details view.
 
-Provider packages functionality
-'''''''''''''''''''''''''''''''
+You can see all the extra links available via community-managed providers in
+:doc:`/core-extensions/extra-links`.
+
+
+Logging
+'''''''
+
+The providers can add additional task logging capabilities. By default
+``Apache Airflow`` saves task logs locally and makes them available to the
+Airflow UI via the internal HTTP server. However, via providers you can add
+extra logging capabilities, where Airflow logs can be written to a remote
+service and retrieved from it.
+
+You can see all task loggers available via community-managed providers in
+:doc:`/core-extensions/logging`.
+
+
+Secret backends
+'''''''''''''''
+
+Airflow has the capability of reading connections, variables and configuration 
from Secret Backends rather
+than from its own Database.
+
+You can see all the secret backends available via community-managed
+providers in :doc:`/core-extensions/secrets-backends`.
+
+
+Installing and upgrading providers
+----------------------------------
 
 Separate provider packages give the possibilities that were not available in 
1.10:
 
@@ -80,34 +117,63 @@ Separate provider packages give the possibilities that 
were not available in 1.1
    following the usual tests you have in your environment.
 
 
-Extending Airflow Connections and Extra links via Providers
-'''''''''''''''''''''''''''''''''''''''''''''''''''''''''''
+Types of providers
+------------------
 
-Providers can contain operators, hooks, sensor, and transfer operators to 
communicate with a
-multitude of external systems, but they can also extend Airflow core. Airflow 
has several extension
-capabilities that can be used by providers. Airflow automatically discovers 
which providers add those
-additional capabilities and, once you install provider package and re-start 
Airflow, those become
-automatically available to Airflow Users.
+Providers have the same capabilities no matter if they are provided by the
+community or by third parties. This chapter explains how community-managed
+providers are versioned and released and how you can create your own
+providers.
+
+Community maintained providers
+''''''''''''''''''''''''''''''
+
+From the point of view of the community, Airflow is delivered in multiple,
+separate packages. The core of the Airflow scheduling system is delivered
+as the ``apache-airflow`` package, and there are more than 60 provider
+packages which can be installed separately as so-called ``Airflow Provider
+packages``. Those packages are available as ``apache-airflow-providers``
+packages - for example there is an ``apache-airflow-providers-amazon`` or
+``apache-airflow-providers-google`` package.
+
+Community maintained providers are released and versioned separately from
+the Airflow releases. We follow the `Semver <https://semver.org/>`_
+versioning scheme for the packages. Some versions of the provider packages
+might depend on particular versions of Airflow, but the general approach is
+that, unless there is a good reason, new versions of providers should work
+with recent versions of Airflow 2.x. Details vary per provider, and if
+there is a limitation for a particular version of a particular provider
+that constrains the Airflow version used, it will be included as a
+dependency limitation in the provider package.
+
+Each community provider has a corresponding extra which can be used when
+installing Airflow to install the provider together with ``Apache Airflow``
+- for example you can install Airflow with the extras
+``apache-airflow[google,amazon]`` (with correct constraints - see
+:doc:`apache-airflow:installation`) and you will install the appropriate
+versions of the ``apache-airflow-providers-amazon`` and
+``apache-airflow-providers-google`` packages together with ``Apache
+Airflow``.
+
+Some of the community providers have cross-provider dependencies as well.
+Those are not required dependencies; they might simply enable certain
+features (for example, transfer operators often create a dependency between
+different providers). Again, the general approach here is that the providers
+are backwards compatible, including cross-dependencies. Any kind of breaking
+changes and requirements on particular versions of other provider packages
+are automatically documented in the release notes of every provider.
 
-The capabilities are:
+.. note::
+    For Airflow 1.10 we also provided ``apache-airflow-backport-providers``
+    packages that could be installed with those versions. Those were the
+    same providers as for 2.0, but automatically back-ported to work with
+    Airflow 1.10. The last release of backport providers was done on
+    March 17, 2021, and the backport providers will no longer be released,
+    since Airflow 1.10 reached End-Of-Life on June 17, 2021.
 
-* Adding Extra Links to operators delivered by the provider. See 
:doc:`apache-airflow:howto/define_extra_link`
-  for a description of what extra links are and examples of provider 
registering an operator with extra links
+If you want to contribute to ``Apache Airflow``, you can see how to build
+and extend community-managed providers in
+:doc:`howto/create-update-providers`.
 
-* Adding custom connection types, extending connection form and handling 
custom form field behaviour for the
-  connections defined by the provider. See 
:doc:`apache-airflow:howto/connection` for a description of
-  connection and what capabilities of custom connection you can define.
+.. _providers:community-maintained-providers:
 
 Custom provider packages
 ''''''''''''''''''''''''
 
-However, there is more. You can develop your own providers. This is a bit 
involved, but your custom operators,
-hooks, sensors, transfer operators can be packaged together in a standard 
airflow package and installed
-using the same mechanisms. Moreover they can also use the same mechanisms to 
extend the Airflow Core with
-custom connections and extra operator links as described in the previous 
chapter.
+You can develop and release your own providers. Your custom operators,
+hooks, sensors and transfer operators can be packaged together in a
+standard Airflow package and installed using the same mechanisms. Moreover,
+they can also use the same mechanisms to extend the Airflow core with auth
+backends, custom connections, logging, secret backends and extra operator
+links, as described in the previous chapter.
 
 How to create your own provider
-'''''''''''''''''''''''''''''''
+-------------------------------
 
 As mentioned in the `Providers 
<http://airflow.apache.org/docs/apache-airflow-providers/index.html>`_
 documentation, custom providers can extend Airflow core - they can add extra 
links to operators as well
@@ -166,10 +232,10 @@ When you write your own provider, consider following the
 
 
 FAQ for Airflow and Providers
-'''''''''''''''''''''''''''''
+-----------------------------
 
 Upgrading Airflow 2.0 and Providers
-"""""""""""""""""""""""""""""""""""
+'''''''''''''''''''''''''''''''''''
 
 **When upgrading to a new Airflow version such as 2.0, but possibly 2.0.1 and 
beyond, is the best practice
 to also upgrade provider packages at the same time?**
@@ -181,24 +247,8 @@ you can either upgrade all used provider packages first, 
and then upgrade Airflo
 round. The first approach - when you first upgrade all providers is probably 
safer, as you can do it
 incrementally, step-by-step replacing provider by provider in your environment.
 
-Using Backport Providers in Airflow 1.10
-""""""""""""""""""""""""""""""""""""""""
-
-**I have an Airflow version (1.10.12) running and it is stable. However, 
because of a Cloud provider change,
-I would like to upgrade the provider package. If I don't need to upgrade the 
Airflow version anymore,
-how do I know that this provider version is compatible with my Airflow 
version?**
-
-We have Backport Providers are compatible with 1.10 but they stopped being 
released on
-March 17, 2021. Since then, no new changes to providers for Airflow 2.0 are 
going to be
-released as backport packages. It's the highest time to upgrade to Airflow 2.0.
-
-When it comes to compatibility of providers with different Airflow 2 versions, 
each
-provider package will keep its own dependencies, and while we expect those 
providers to be generally
-backwards-compatible, particular versions of particular providers might 
introduce dependencies on
-specific Airflow versions.
-
 Customizing Provider Packages
-"""""""""""""""""""""""""""""
+'''''''''''''''''''''''''''''
 
 **I have an older version of my provider package which we have lightly 
customized and is working
 fine with my MSSQL installation. I am upgrading my Airflow version. Do I need 
to upgrade my provider,
@@ -212,7 +262,7 @@ as you have not used internal Airflow classes) should work 
for All Airflow 2.* v
 
 
 Creating your own providers
-"""""""""""""""""""""""""""
+'''''''''''''''''''''''''''
 
 **When I write my own provider, do I need to do anything special to make it 
available to others?**
 
@@ -323,7 +373,6 @@ After you think that your provider matches the expected 
values above,  you can r
 :doc:`howto/create-update-providers` to check all prerequisites for a new
 community Provider and discuss it at the `Devlist 
<http://airflow.apache.org/community/>`_.
 
-
 However, in case you have your own, specific provider, which you can maintain 
on your own or by your
 team, you are free to publish the providers in whatever form you find 
appropriate. The custom and
 community-managed providers have exactly the same capabilities.
@@ -342,13 +391,30 @@ commercial-friendly and there are many businesses built 
around Apache Airflow an
 Apache projects. As a community, we provide all the software for free and this 
will never
 change. What 3rd-party developers are doing is not under control of Apache 
Airflow community.
 
+Using Backport Providers in Airflow 1.10
+''''''''''''''''''''''''''''''''''''''''
+
+**I have an Airflow version (1.10.12) running and it is stable. However, 
because of a Cloud provider change,
+I would like to upgrade the provider package. If I don't need to upgrade the 
Airflow version anymore,
+how do I know that this provider version is compatible with my Airflow 
version?**
+
+We have Backport Providers that are compatible with 1.10, but they stopped
+being released on March 17, 2021. Since then, no new changes to providers
+for Airflow 2.0 are going to be released as backport packages. It is high
+time to upgrade to Airflow 2.0.
+
+When it comes to compatibility of providers with different Airflow 2 versions, 
each
+provider package will keep its own dependencies, and while we expect those 
providers to be generally
+backwards-compatible, particular versions of particular providers might 
introduce dependencies on
+specific Airflow versions.
 
-Content
--------
+Contents
+--------
 
 .. toctree::
-    :maxdepth: 1
+    :maxdepth: 2
 
+    Providers <self>
     Packages <packages-ref>
     Operators and hooks <operators-and-hooks-ref/index>
-    Howto create and update community providers <howto/create-update-providers>
+    Core Extensions <core-extensions/index>
+    Update community providers <howto/create-update-providers>
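To make the extras-based installation described above concrete, a hedged example of installing Airflow together with the Google and Amazon providers using the constraint files mentioned in the installation docs; the Airflow and Python versions below are placeholders:

.. code-block:: bash

    pip install "apache-airflow[google,amazon]==2.1.2" \
        --constraint "https://raw.githubusercontent.com/apache/airflow/constraints-2.1.2/constraints-3.8.txt"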
diff --git a/docs/apache-airflow/concepts/connections.rst 
b/docs/apache-airflow/concepts/connections.rst
index e878a8b..c1fdabb 100644
--- a/docs/apache-airflow/concepts/connections.rst
+++ b/docs/apache-airflow/concepts/connections.rst
@@ -37,3 +37,13 @@ A Hook is a high-level interface to an external platform 
that lets you quickly a
 They integrate with Connections to gather credentials, and many have a default 
``conn_id``; for example, the 
:class:`~airflow.providers.postgres.hooks.postgres.PostgresHook` automatically 
looks for the Connection with a ``conn_id`` of ``postgres_default`` if you 
don't pass one in.
 
 You can view a :ref:`full list of airflow hooks <pythonapi:hooks>` in our API 
documentation.
+
+Custom connections
+------------------
+
+Airflow allows you to define custom connection types. This is described in
+detail in :doc:`apache-airflow-providers:index` - providers give you the
+capability of defining your own connections. Connection customization can
+be done by any provider, and many of the providers managed by the community
+define custom connection types. The full list of all connection types
+delivered by community-managed providers can be found in
+:doc:`apache-airflow-providers:core-extensions/connections`.
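As a small usage sketch of the default ``conn_id`` behaviour described above (assumes the ``postgres`` provider is installed and a ``postgres_default`` Connection exists):

.. code-block:: python

    from airflow.providers.postgres.hooks.postgres import PostgresHook

    # With no conn_id passed, the hook looks up the Connection "postgres_default".
    hook = PostgresHook()
    rows = hook.get_records("SELECT 1")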
diff --git a/docs/apache-airflow/concepts/operators.rst 
b/docs/apache-airflow/concepts/operators.rst
index 62c4e27..2514635 100644
--- a/docs/apache-airflow/concepts/operators.rst
+++ b/docs/apache-airflow/concepts/operators.rst
@@ -48,11 +48,16 @@ If the operator you need isn't installed with Airflow by 
default, you can probab
 - 
:class:`~airflow.providers.mysql.transfers.presto_to_mysql.PrestoToMySqlOperator`
 - :class:`~airflow.providers.slack.operators.slack.SlackAPIOperator`
 
-But there are many, many more - you can see the list of those in our 
:doc:`providers packages 
<apache-airflow-providers:operators-and-hooks-ref/index>` documentation.
+But there are many, many more - you can see the full list of all 
community-managed operators, hooks, sensors
+and transfers in our
+:doc:`providers packages 
<apache-airflow-providers:operators-and-hooks-ref/index>` documentation.
 
 .. note::
 
-    Inside Airflow's code, we often mix the concepts of :doc:`tasks` and 
Operators, and they are mostly interchangeable. However, when we talk about a 
*Task*, we mean the generic "unit of execution" of a DAG; when we talk about an 
*Operator*, we mean a reusable, pre-made Task template whose logic is all done 
for you and that just needs some arguments.
+    Inside Airflow's code, we often mix the concepts of :doc:`tasks` and 
Operators, and they are mostly
+    interchangeable. However, when we talk about a *Task*, we mean the generic 
"unit of execution" of a
+    DAG; when we talk about an *Operator*, we mean a reusable, pre-made Task 
template whose logic
+    is all done for you and that just needs some arguments.
 
 
 .. _concepts:jinja-templating:
diff --git a/docs/apache-airflow/howto/define_extra_link.rst 
b/docs/apache-airflow/howto/define_extra_link.rst
index 2a89323..223cd42 100644
--- a/docs/apache-airflow/howto/define_extra_link.rst
+++ b/docs/apache-airflow/howto/define_extra_link.rst
@@ -21,9 +21,7 @@
 Define an operator extra link
 =============================
 
-For each operator, you can define its own extra links that can
-redirect users to external systems. The extra link buttons
-will be available on the task page:
+
 
 .. image:: ../img/operator_extra_link.png
 
@@ -66,6 +64,9 @@ You can also add a global operator extra link that will be 
available to
 all the operators through an airflow plugin or through airflow providers. You 
can learn more about it in the
 :ref:`plugin example <plugin-example>` and in 
:doc:`apache-airflow-providers:index`.
 
+You can see all the extra links available via community-managed providers in
+:doc:`apache-airflow-providers:core-extensions/extra-links`.
+
 
 Add or override Links to Existing Operators
 -------------------------------------------
diff --git a/docs/apache-airflow/logging-monitoring/logging-tasks.rst 
b/docs/apache-airflow/logging-monitoring/logging-tasks.rst
index ece0402..c1b8636 100644
--- a/docs/apache-airflow/logging-monitoring/logging-tasks.rst
+++ b/docs/apache-airflow/logging-monitoring/logging-tasks.rst
@@ -20,7 +20,15 @@
 Logging for Tasks
 =================
 
-Writing Logs Locally
+Airflow writes task logs in a way that allows you to see the logs for each
+task separately via the Airflow UI. Core Airflow implements writing and
+serving logs locally. However, you can also write logs to remote services
+via community providers, or write your own loggers.
+
+Below we describe the local task logging, but the Apache Airflow Community
+also releases providers for many services
+(:doc:`apache-airflow-providers:index`) and some of them also provide
+handlers that extend the logging capability of Apache Airflow. You can see
+all those providers in
+:doc:`apache-airflow-providers:core-extensions/logging`.
+
+Writing logs locally
 --------------------
 
 Users can specify the directory to place log files in ``airflow.cfg`` using
@@ -39,6 +47,7 @@ can not be found or accessed, local logs will be displayed. 
Note that logs
 are only sent to remote storage once a task is complete (including failure); 
In other words, remote logs for
 running tasks are unavailable (but local logs are available).
 
+
 Troubleshooting
 ---------------
 
diff --git a/docs/apache-airflow/operators-and-hooks-ref.rst 
b/docs/apache-airflow/operators-and-hooks-ref.rst
index 91ad4d5..3a27439 100644
--- a/docs/apache-airflow/operators-and-hooks-ref.rst
+++ b/docs/apache-airflow/operators-and-hooks-ref.rst
@@ -19,8 +19,11 @@ Operators and Hooks Reference
 =============================
 
 Here's the list of the operators and hooks which are available in this release 
in the ``apache-airflow`` package.
-Airflow has many more integrations available for separate installation as a 
provider packages. For details see:
-:doc:`apache-airflow-providers:operators-and-hooks-ref/index`.
+
+Airflow has many more integrations available for separate installation as
+provider packages (:doc:`apache-airflow-providers:index`).
+
+For details see: :doc:`apache-airflow-providers:operators-and-hooks-ref/index`.
 
 **Base:**
 
diff --git a/docs/apache-airflow/security/secrets/secrets-backend/index.rst 
b/docs/apache-airflow/security/secrets/secrets-backend/index.rst
index 272a2a5..a70556d 100644
--- a/docs/apache-airflow/security/secrets/secrets-backend/index.rst
+++ b/docs/apache-airflow/security/secrets/secrets-backend/index.rst
@@ -20,10 +20,10 @@ Secrets Backend
 
 .. versionadded:: 1.10.10
 
-In addition to retrieving connections & variables from environment variables 
or the metastore database, you can enable
-an alternative secrets backend to retrieve Airflow connections or Airflow 
variables,
-such as :ref:`Google Cloud Secret 
Manager<google_cloud_secret_manager_backend>`,
-:ref:`Hashicorp Vault Secrets<hashicorp_vault_secrets>` or you can :ref:`roll 
your own <roll_your_own_secrets_backend>`.
+In addition to retrieving connections & variables from environment
+variables or the metastore database, you can also enable an alternative
+secrets backend to retrieve Airflow connections or Airflow variables via
+:ref:`Apache Airflow Community provided backends <community_secret_backends>`
+in :doc:`apache-airflow-providers:core-extensions/secrets-backends`.
 
 .. note::
 
@@ -67,8 +67,8 @@ the example below.
     $ airflow config get-value secrets backend
     
airflow.providers.google.cloud.secrets.secret_manager.CloudSecretManagerBackend
 
-Supported backends
-^^^^^^^^^^^^^^^^^^
+Supported core backends
+^^^^^^^^^^^^^^^^^^^^^^^
 
 .. toctree::
     :maxdepth: 1
@@ -76,6 +76,17 @@ Supported backends
 
     *
 
+.. _community_secret_backends:
+
+Apache Airflow Community provided secret backends
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+The Apache Airflow Community also releases community developed providers
+(:doc:`apache-airflow-providers:index`) and some of them also provide
+secrets backends that extend the secrets management capability of Apache
+Airflow. You can see all those providers in
+:doc:`apache-airflow-providers:core-extensions/secrets-backends`.
+
+
 .. _roll_your_own_secrets_backend:
 
 Roll your own secrets backend
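To illustrate the roll-your-own path, a hedged sketch of a custom secrets backend; the class and its in-memory storage are hypothetical, only the ``BaseSecretsBackend``/``get_conn_uri`` interface is from Airflow 2.x:

.. code-block:: python

    from typing import Optional

    from airflow.secrets import BaseSecretsBackend

    class InMemorySecretsBackend(BaseSecretsBackend):
        """Hypothetical backend serving connection URIs from a dict."""

        def __init__(self, connections: Optional[dict] = None):
            super().__init__()
            self.connections = connections or {}

        def get_conn_uri(self, conn_id: str) -> Optional[str]:
            # Returning None makes Airflow fall through to the next backend.
            return self.connections.get(conn_id)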
diff --git a/docs/build_docs.py b/docs/build_docs.py
index a474c83..0f52211 100755
--- a/docs/build_docs.py
+++ b/docs/build_docs.py
@@ -486,6 +486,61 @@ def main():
         all_spelling_errors.update(package_spelling_errors)
 
     # Build documentation for some packages again if it can help them.
+    package_build_errors, package_spelling_errors = 
retry_building_docs_if_needed(
+        all_build_errors,
+        all_spelling_errors,
+        args,
+        docs_only,
+        for_production,
+        jobs,
+        package_build_errors,
+        package_spelling_errors,
+        spellcheck_only,
+    )
+
+    # And try again in case one change spans across three-level dependencies
+    retry_building_docs_if_needed(
+        all_build_errors,
+        all_spelling_errors,
+        args,
+        docs_only,
+        for_production,
+        jobs,
+        package_build_errors,
+        package_spelling_errors,
+        spellcheck_only,
+    )
+
+    if not disable_checks:
+        general_errors = lint_checks.run_all_check()
+        if general_errors:
+            all_build_errors[None] = general_errors
+
+    dev_index_generator.generate_index(f"{DOCS_DIR}/_build/index.html")
+
+    if not package_filters:
+        _promote_new_flags()
+
+    if os.path.exists(PROVIDER_INIT_FILE):
+        os.remove(PROVIDER_INIT_FILE)
+
+    print_build_errors_and_exit(
+        all_build_errors,
+        all_spelling_errors,
+    )
+
+
+def retry_building_docs_if_needed(
+    all_build_errors,
+    all_spelling_errors,
+    args,
+    docs_only,
+    for_production,
+    jobs,
+    package_build_errors,
+    package_spelling_errors,
+    spellcheck_only,
+):
     to_retry_packages = [
         package_name
         for package_name, errors in package_build_errors.items()
@@ -510,24 +565,8 @@ def main():
             all_build_errors.update(package_build_errors)
         if package_spelling_errors:
             all_spelling_errors.update(package_spelling_errors)
-
-    if not disable_checks:
-        general_errors = lint_checks.run_all_check()
-        if general_errors:
-            all_build_errors[None] = general_errors
-
-    dev_index_generator.generate_index(f"{DOCS_DIR}/_build/index.html")
-
-    if not package_filters:
-        _promote_new_flags()
-
-    if os.path.exists(PROVIDER_INIT_FILE):
-        os.remove(PROVIDER_INIT_FILE)
-
-    print_build_errors_and_exit(
-        all_build_errors,
-        all_spelling_errors,
-    )
+        return package_build_errors, package_spelling_errors
+    return package_build_errors, package_spelling_errors
 
 
 if __name__ == "__main__":
diff --git a/docs/exts/auth_backend.rst.jinja2 
b/docs/exts/auth_backend.rst.jinja2
new file mode 100644
index 0000000..d14b115
--- /dev/null
+++ b/docs/exts/auth_backend.rst.jinja2
@@ -0,0 +1,27 @@
+{#
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements.  See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership.  The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License.  You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied.  See the License for the
+ specific language governing permissions and limitations
+ under the License.
+#}
+{%for provider, provider_dict in items.items() %}
+{{ provider_dict['name'] }}
+{{ header_separator * (provider_dict['name']|length) }}
+
+{% for backend in provider_dict['auth_backends'] -%}
+- :class:`~{{ backend }}`
+{% endfor -%}
+
+{% endfor %}
diff --git a/docs/exts/connections.rst.jinja2 b/docs/exts/connections.rst.jinja2
new file mode 100644
index 0000000..1b4f63d
--- /dev/null
+++ b/docs/exts/connections.rst.jinja2
@@ -0,0 +1,27 @@
+{#
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements.  See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership.  The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License.  You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied.  See the License for the
+ specific language governing permissions and limitations
+ under the License.
+#}
+{%for provider, provider_dict in items.items() %}
+{{ provider_dict['name'] }}
+{{ header_separator * (provider_dict['name']|length) }}
+
+{% for backend in provider_dict['connection_types'] -%}
+- `{{ backend['connection-type'] }}`: :class:`~{{ backend['hook-class-name'] 
}}`
+{% endfor -%}
+
+{% endfor %}
diff --git a/docs/exts/extra_links.rst.jinja2 b/docs/exts/extra_links.rst.jinja2
new file mode 100644
index 0000000..be085fd
--- /dev/null
+++ b/docs/exts/extra_links.rst.jinja2
@@ -0,0 +1,27 @@
+{#
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements.  See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership.  The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License.  You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied.  See the License for the
+ specific language governing permissions and limitations
+ under the License.
+#}
+{%for provider, provider_dict in items.items() %}
+{{ provider_dict['name'] }}
+{{ header_separator * (provider_dict['name']|length) }}
+
+{% for extra_link in provider_dict['extra_links'] -%}
+    - :class:`~{{ extra_link }}`
+{% endfor -%}
+
+{% endfor %}
diff --git a/docs/exts/logging.rst.jinja2 b/docs/exts/logging.rst.jinja2
new file mode 100644
index 0000000..0ed076c
--- /dev/null
+++ b/docs/exts/logging.rst.jinja2
@@ -0,0 +1,29 @@
+{#
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements.  See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership.  The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License.  You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied.  See the License for the
+ specific language governing permissions and limitations
+ under the License.
+#}
+{%for provider, provider_dict in items.items() %}
+{{ provider_dict['name'] }}
+{{ header_separator * (provider_dict['name']|length) }}
+
+:doc:`{{ provider }}:logging/index`
+
+{% for handler in provider_dict['handlers'] -%}
+- :class:`~{{ handler }}`
+{% endfor -%}
+
+{% endfor %}
diff --git a/docs/exts/operators_and_hooks_ref.py 
b/docs/exts/operators_and_hooks_ref.py
index 91824b3..a0d5bc9 100644
--- a/docs/exts/operators_and_hooks_ref.py
+++ b/docs/exts/operators_and_hooks_ref.py
@@ -17,8 +17,9 @@
 
 import os
 from functools import lru_cache
-from typing import Optional, Set
+from typing import Iterable, Optional, Set
 
+import click
 import jinja2
 from docutils import nodes
 from docutils.nodes import Element
@@ -88,7 +89,7 @@ def _prepare_operators_data(tags: Optional[Set[str]]):
     package_data = load_package_data()
     all_integrations = _prepare_resource_index(package_data, "integrations")
     if tags is None:
-        to_display_integration = all_integrations
+        to_display_integration = all_integrations.values()
     else:
         to_display_integration = [
             integration for integration in all_integrations.values() if 
tags.intersection(integration["tags"])
@@ -121,7 +122,7 @@ def _prepare_operators_data(tags: Optional[Set[str]]):
     return sorted(results, key=lambda d: 
d["integration"]["integration-name"].lower())
 
 
-def _render_operator_content(*, tags: Optional[Set[str]], header_separator: 
str = DEFAULT_HEADER_SEPARATOR):
+def _render_operator_content(*, tags: Optional[Set[str]], header_separator: 
str):
     tabular_data = _prepare_operators_data(tags)
 
     return _render_template(
@@ -162,7 +163,7 @@ def _prepare_transfer_data(tags: Optional[Set[str]]):
     return to_display_transfers
 
 
-def _render_transfer_content(*, tags: Optional[Set[str]], header_separator: 
str = DEFAULT_HEADER_SEPARATOR):
+def _render_transfer_content(*, tags: Optional[Set[str]], header_separator: 
str):
     tabular_data = _prepare_transfer_data(tags)
 
     return _render_template(
@@ -170,6 +171,102 @@ def _render_transfer_content(*, tags: Optional[Set[str]], 
header_separator: str
     )
 
 
+def _prepare_logging_data():
+    package_data = load_package_data()
+    all_logging = {}
+    for provider in package_data:
+        logging_handlers = provider.get("logging")
+        if logging_handlers:
+            package_name = provider['package-name']
+            all_logging[package_name] = {'name': provider['name'], 'handlers': 
logging_handlers}
+    return all_logging
+
+
+def _render_logging_content(*, header_separator: str):
+    tabular_data = _prepare_logging_data()
+
+    return _render_template("logging.rst.jinja2", items=tabular_data, 
header_separator=header_separator)
+
+
+def _prepare_auth_backend_data():
+    package_data = load_package_data()
+    all_auth_backends = {}
+    for provider in package_data:
+        auth_backends_list = provider.get("auth-backends")
+        if auth_backends_list:
+            package_name = provider['package-name']
+            all_auth_backends[package_name] = {'name': provider['name'], 
'auth_backends': auth_backends_list}
+    return all_auth_backends
+
+
+def _render_auth_backend_content(*, header_separator: str):
+    tabular_data = _prepare_auth_backend_data()
+
+    return _render_template("auth_backend.rst.jinja2", items=tabular_data, 
header_separator=header_separator)
+
+
+def _prepare_secrets_backend_data():
+    package_data = load_package_data()
+    all_secret_backends = {}
+    for provider in package_data:
+        secret_backends_list = provider.get("secrets-backends")
+        if secret_backends_list:
+            package_name = provider['package-name']
+            all_secret_backends[package_name] = {
+                'name': provider['name'],
+                'secrets_backends': secret_backends_list,
+            }
+    return all_secret_backends
+
+
+def _render_secrets_backend_content(*, header_separator: str):
+    tabular_data = _prepare_secrets_backend_data()
+
+    return _render_template(
+        "secret_backend.rst.jinja2", items=tabular_data, 
header_separator=header_separator
+    )
+
+
+def _prepare_connections_data():
+    package_data = load_package_data()
+    all_connections = {}
+    for provider in package_data:
+        connections_list = provider.get("connection-types")
+        if connections_list:
+            package_name = provider['package-name']
+            all_connections[package_name] = {
+                'name': provider['name'],
+                'connection_types': connections_list,
+            }
+    return all_connections
+
+
+def _render_connections_content(*, header_separator: str):
+    tabular_data = _prepare_connections_data()
+
+    return _render_template("connections.rst.jinja2", items=tabular_data, 
header_separator=header_separator)
+
+
+def _prepare_extra_links_data():
+    package_data = load_package_data()
+    all_extra_links = {}
+    for provider in package_data:
+        extra_link_list = provider.get("extra-links")
+        if extra_link_list:
+            package_name = provider['package-name']
+            all_extra_links[package_name] = {
+                'name': provider['name'],
+                'extra_links': extra_link_list,
+            }
+    return all_extra_links
+
+
+def _render_extra_links_content(*, header_separator: str):
+    tabular_data = _prepare_extra_links_data()
+
+    return _render_template("extra_links.rst.jinja2", items=tabular_data, 
header_separator=header_separator)
+
+
 class BaseJinjaReferenceDirective(Directive):
     """The base directive for OperatorsHooksReferenceDirective and 
TransfersReferenceDirective"""
 
@@ -222,43 +319,130 @@ class TransfersReferenceDirective(BaseJinjaReferenceDirective):
         )
 
 
+class LoggingDirective(BaseJinjaReferenceDirective):
+    """Generate list of logging handlers"""
+
+    def render_content(self, *, tags: Optional[Set[str]], header_separator: str = DEFAULT_HEADER_SEPARATOR):
+        return _render_logging_content(
+            header_separator=header_separator,
+        )
+
+
+class AuthBackendDirective(BaseJinjaReferenceDirective):
+    """Generate list of auth backend handlers"""
+
+    def render_content(self, *, tags: Optional[Set[str]], header_separator: str = DEFAULT_HEADER_SEPARATOR):
+        return _render_auth_backend_content(
+            header_separator=header_separator,
+        )
+
+
+class SecretsBackendDirective(BaseJinjaReferenceDirective):
+    """Generate list of secret backend handlers"""
+
+    def render_content(self, *, tags: Optional[Set[str]], header_separator: str = DEFAULT_HEADER_SEPARATOR):
+        return _render_secrets_backend_content(
+            header_separator=header_separator,
+        )
+
+
+class ConnectionsDirective(BaseJinjaReferenceDirective):
+    """Generate list of connections"""
+
+    def render_content(self, *, tags: Optional[Set[str]], header_separator: str = DEFAULT_HEADER_SEPARATOR):
+        return _render_connections_content(
+            header_separator=header_separator,
+        )
+
+
+class ExtraLinksDirective(BaseJinjaReferenceDirective):
+    """Generate list of extra links"""
+
+    def render_content(self, *, tags: Optional[Set[str]], header_separator: str = DEFAULT_HEADER_SEPARATOR):
+        return _render_extra_links_content(
+            header_separator=header_separator,
+        )
+
+
 def setup(app):
     """Setup plugin"""
     app.add_directive('operators-hooks-ref', OperatorsHooksReferenceDirective)
     app.add_directive('transfers-ref', TransfersReferenceDirective)
+    app.add_directive('airflow-logging', LoggingDirective)
+    app.add_directive('airflow-auth-backends', AuthBackendDirective)
+    app.add_directive('airflow-secrets-backends', SecretsBackendDirective)
+    app.add_directive('airflow-connections', ConnectionsDirective)
+    app.add_directive('airflow-extra-links', ExtraLinksDirective)
 
     return {'parallel_read_safe': True, 'parallel_write_safe': True}
 
 
-if __name__ == "__main__":
-    import argparse
-
-    parser = argparse.ArgumentParser(description='Render tables with integrations.')
-    parser.add_argument(
-        '--tag',
-        dest='tags',
-        action="append",
-        help='If passed, displays integrations that have a matching tag.',
-    )
-    parser.add_argument('--header-separator', default=DEFAULT_HEADER_SEPARATOR)
-    subparsers = parser.add_subparsers(help='sub-command help', metavar="COMMAND")
-    subparsers.required = True
+option_tag = click.option(
+    '--tag',
+    multiple=True,
+    help="If passed, displays integrations that have a matching tag",
+)
 
-    parser_a = subparsers.add_parser(CMD_OPERATORS_AND_HOOKS)
-    parser_a.set_defaults(cmd=CMD_OPERATORS_AND_HOOKS)
+option_header_separator = click.option(
+    '--header-separator', default=DEFAULT_HEADER_SEPARATOR, show_default=True
+)
 
-    parser_b = subparsers.add_parser(CMD_TRANSFERS)
-    parser_b.set_defaults(cmd=CMD_TRANSFERS)
 
-    args = parser.parse_args()
+@click.group(context_settings={'help_option_names': ['-h', '--help'], 'max_content_width': 500})
+def cli():
+    """Render tables with integrations"""
 
-    if args.cmd == CMD_OPERATORS_AND_HOOKS:
-        content = _render_operator_content(
-            tags=set(args.tags) if args.tags else None, header_separator=args.header_separator
-        )
-    else:
-        content = _render_transfer_content(
-            tags=set(args.tags) if args.tags else None, header_separator=args.header_separator
-        )
 
-    print(content)
+@cli.command()
+@option_tag
+@option_header_separator
+def operators_and_hooks(tag: Iterable[str], header_separator: str):
+    """Renders Operators ahd Hooks content"""
+    print(_render_operator_content(tags=set(tag) if tag else None, 
header_separator=header_separator))
+
+
+@cli.command()
+@option_tag
+@option_header_separator
+def transfers(tag: Iterable[str], header_separator: str):
+    """Renders Transfers content"""
+    print(_render_transfer_content(tags=set(tag) if tag else None, header_separator=header_separator))
+
+
+@cli.command()
+@option_header_separator
+def logging(header_separator: str):
+    """Renders Logger content"""
+    print(_render_logging_content(header_separator=header_separator))
+
+
+@cli.command()
+@option_header_separator
+def auth_backends(header_separator: str):
+    """Renders Logger content"""
+    print(_render_auth_backend_content(header_separator=header_separator))
+
+
+@cli.command()
+@option_header_separator
+def secret_backends(header_separator: str):
+    """Renders Secret Backends content"""
+    print(_render_secrets_backend_content(header_separator=header_separator))
+
+
+@cli.command()
+@option_header_separator
+def connections(header_separator: str):
+    """Renders Connections content"""
+    print(_render_connections_content(header_separator=header_separator))
+
+
+@cli.command()
+@option_header_separator
+def extra_links(header_separator: str):
+    """Renders Extra  links content"""
+    print(_render_extra_links_content(header_separator=header_separator))
+
+
+if __name__ == "__main__":
+    cli()
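
For illustration only (not part of the commit): the new Click group can also be
exercised programmatically through Click's testing helper. This is a rough
sketch that assumes docs/exts is on PYTHONPATH so the module and its provider
YAML data resolve:

    # Sketch: equivalent to running
    #   python docs/exts/operators_and_hooks_ref.py logging --header-separator '='
    from click.testing import CliRunner

    from operators_and_hooks_ref import cli

    runner = CliRunner()
    result = runner.invoke(cli, ["logging", "--header-separator", "="])
    print(result.output)
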
diff --git a/docs/exts/secret_backend.rst.jinja2 b/docs/exts/secret_backend.rst.jinja2
new file mode 100644
index 0000000..52646c2
--- /dev/null
+++ b/docs/exts/secret_backend.rst.jinja2
@@ -0,0 +1,27 @@
+{#
+ Licensed to the Apache Software Foundation (ASF) under one
+ or more contributor license agreements.  See the NOTICE file
+ distributed with this work for additional information
+ regarding copyright ownership.  The ASF licenses this file
+ to you under the Apache License, Version 2.0 (the
+ "License"); you may not use this file except in compliance
+ with the License.  You may obtain a copy of the License at
+
+   http://www.apache.org/licenses/LICENSE-2.0
+
+ Unless required by applicable law or agreed to in writing,
+ software distributed under the License is distributed on an
+ "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ KIND, either express or implied.  See the License for the
+ specific language governing permissions and limitations
+ under the License.
+#}
+{% for provider, provider_dict in items.items() %}
+{{ provider_dict['name'] }}
+{{ header_separator * (provider_dict['name']|length) }}
+
+{% for backend in provider_dict['secrets_backends'] -%}
+- :class:`~{{ backend }}`
+{% endfor -%}
+
+{% endfor %}
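
As a rough sketch of what this template yields, it can be rendered standalone
with jinja2; the provider key, display name, and backend class path below are
illustrative sample values, not data taken from the repository:

    import jinja2

    env = jinja2.Environment(loader=jinja2.FileSystemLoader("docs/exts"))
    template = env.get_template("secret_backend.rst.jinja2")

    items = {
        "apache-airflow-providers-amazon": {
            "name": "Amazon",
            # Sample class path; real values come from provider.yaml files.
            "secrets_backends": [
                "airflow.providers.amazon.aws.secrets.secrets_manager.SecretsManagerBackend"
            ],
        }
    }
    # Emits one RST section per provider, underlined with header_separator.
    print(template.render(items=items, header_separator="="))
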
diff --git a/docs/helm-chart/manage-logs.rst b/docs/helm-chart/manage-logs.rst
index e23fe06..5fd06a3 100644
--- a/docs/helm-chart/manage-logs.rst
+++ b/docs/helm-chart/manage-logs.rst
@@ -82,7 +82,7 @@ Elasticsearch
 -------------
 
 If your cluster forwards logs to Elasticsearch, you can configure Airflow to retrieve task logs from it.
-See the :doc:`Elasticsearch providers guide <apache-airflow-providers-elasticsearch:logging>` for more details.
+See the :doc:`Elasticsearch providers guide <apache-airflow-providers-elasticsearch:logging/index>` for more details.
 
 .. code-block:: bash
 
diff --git a/setup.py b/setup.py
index 86f89fb..9a4c866 100644
--- a/setup.py
+++ b/setup.py
@@ -244,6 +244,7 @@ deprecated_api = [
     'requests>=2.26.0',
 ]
 doc = [
+    'click>=7.1,<9',
     # Sphinx is limited to < 3.5.0 because of https://github.com/sphinx-doc/sphinx/issues/8880
     'sphinx>=2.1.2, <3.5.0',
     'sphinx-airflow-theme',
