This is an automated email from the ASF dual-hosted git repository.
uranusjr pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git
The following commit(s) were added to refs/heads/main by this push:
new d067cfe Fix "Top level Python Code" links in best practices doc (#19913)
d067cfe is described below
commit d067cfea701b7077bfa6c0f2b5339848579ee184
Author: Jed Cunningham <[email protected]>
AuthorDate: Wed Dec 1 00:45:53 2021 -0700
Fix "Top level Python Code" links in best practices doc (#19913)
---
docs/apache-airflow/best-practices.rst | 4 ++--
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/docs/apache-airflow/best-practices.rst b/docs/apache-airflow/best-practices.rst
index 7a90f23..951e6b4 100644
--- a/docs/apache-airflow/best-practices.rst
+++ b/docs/apache-airflow/best-practices.rst
@@ -260,7 +260,7 @@ Dynamic DAGs with external configuration from a structured data file
If you need to use a more complex meta-data to prepare your DAG structure and you would prefer to keep the
data in a structured non-python format, you should export the data to the DAG folder in a file and push
it to the DAG folder, rather than try to pull the data by the DAG's top-level code - for the reasons
-explained in the parent `Top level Python code <_top-level-python-code>`_.
+explained in the parent :ref:`best_practices/top_level_code`.
The meta-data should be exported and stored together with the DAGs in a convenient file format (JSON, YAML
formats are good candidates) in DAG folder. Ideally, the meta-data should be published in the same
@@ -282,7 +282,7 @@ the meta-data file in your DAG easily. The location of the file to read can be f
Airflow Variables
-----------------
-As mentioned in the previous chapter, `Top level Python code <_top-level-python-code>`_. you should avoid
+As mentioned in the previous chapter, :ref:`best_practices/top_level_code`. you should avoid
using Airflow Variables at top level Python code of DAGs. You can use the Airflow Variables freely inside the
``execute()`` methods of the operators, but you can also pass the Airflow Variables to the existing operators
via Jinja template, which will delay reading the value until the task execution.
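For context on the deferred-read pattern the patched paragraph describes (not part of this commit): in Airflow the idiom is a templated operator field such as ``bash_command="echo {{ var.value.my_var }}"``, which the scheduler never evaluates at parse time. A minimal stdlib-only sketch of the same idea, with hypothetical function names chosen for illustration:

```python
# Illustrative sketch only (hypothetical names, not Airflow code):
# building a template at "parse time" and resolving the variable only
# at "execution time" - the reason templating avoids top-level reads.
from string import Template

def make_task_command():
    # Parse time: we only construct a template string.
    # No variable store is touched yet.
    return Template("echo $my_var")

def execute(template, variables):
    # Execution time: the variable is looked up only now,
    # analogous to Jinja rendering {{ var.value.my_var }} in a task.
    return template.substitute(variables)

cmd = make_task_command()                   # no variable access here
print(execute(cmd, {"my_var": "42"}))       # prints: echo 42
```

In Airflow the same separation happens automatically: Jinja templates in operator fields are rendered per task instance at run time, so DAG parsing stays cheap.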