Lee-W commented on code in PR #53183:
URL: https://github.com/apache/airflow/pull/53183#discussion_r2207296454
##########
airflow-core/docs/administration-and-deployment/dagfile-processing.rst:
##########
@@ -16,43 +16,43 @@
specific language governing permissions and limitations
under the License.
-DAG File Processing
+Dag File Processing
-------------------
-DAG File Processing refers to the process of reading the python files that define your dags and storing them such that the scheduler can schedule them.
+Dag File Processing refers to the process of reading the python files that define your dags and storing them such that the scheduler can schedule them.
-There are two primary components involved in DAG file processing. The ``DagFileProcessorManager`` is a process executing an infinite loop that determines which files need
-to be processed, and the ``DagFileProcessorProcess`` is a separate process that is started to convert an individual file into one or more DAG objects.
+There are two primary components involved in dag file processing. The ``DagFileProcessorManager`` is a process executing an infinite loop that determines which files need
+to be processed, and the ``DagFileProcessorProcess`` is a separate process that is started to convert an individual file into one or more dag objects.
Because the ``DagFileProcessorManager`` runs user code, it runs as a standalone process, started with the ``airflow dag-processor`` CLI command.
.. image:: /img/dag_file_processing_diagram.png
``DagFileProcessorManager`` has the following steps:
-1. Check for new files: If the elapsed time since the DAG was last refreshed is > :ref:`config:scheduler__dag_dir_list_interval` then update the file paths list
+1. Check for new files: If the elapsed time since the dag was last refreshed is > :ref:`config:dag_processor__refresh_interval` then update the file paths list
2. Exclude recently processed files: Exclude files that have been processed more recently than :ref:`min_file_process_interval<config:dag_processor__min_file_process_interval>` and have not been modified
3. Queue file paths: Add files discovered to the file path queue
4. Process files: Start a new ``DagFileProcessorProcess`` for each file, up to a maximum of :ref:`config:dag_processor__parsing_processes`
-5. Collect results: Collect the result from any finished DAG processors
+5. Collect results: Collect the result from any finished dag processors
6. Log statistics: Print statistics and emit ``dag_processing.total_parse_time``
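The queueing behaviour in steps 2-4 above can be sketched in a few lines of Python. Everything here is an illustrative stand-in for the manager's internal bookkeeping and the referenced config options, not Airflow's actual implementation:

```python
def select_files_to_parse(candidates, last_parsed, mtimes, now,
                          min_interval=30.0, max_workers=2):
    """Simplified sketch of manager steps 2-4.

    candidates: file paths discovered in the dag directory
    last_parsed: path -> timestamp of the last successful parse
    mtimes: path -> last-modified timestamp
    min_interval / max_workers: stand-ins for the
    min_file_process_interval and parsing_processes config options.
    """
    queued = []
    for path in candidates:
        parsed_at = last_parsed.get(path)
        recently_parsed = parsed_at is not None and now - parsed_at < min_interval
        modified_since = parsed_at is not None and mtimes.get(path, 0.0) > parsed_at
        if recently_parsed and not modified_since:
            continue  # step 2: skip recently parsed, unmodified files
        queued.append(path)  # step 3: add to the file path queue
    return queued[:max_workers]  # step 4: bounded by parsing_processes


# Usage: "a.py" was parsed 10s ago and is unmodified, so it is skipped;
# "b.py" was parsed long ago and "c.py" never, so both are picked up.
picked = select_files_to_parse(
    ["a.py", "b.py", "c.py"],
    last_parsed={"a.py": 90.0, "b.py": 10.0},
    mtimes={},
    now=100.0,
)
```

The real manager also tracks parse errors and per-file stats; this sketch only shows the filter-queue-bound shape of the loop.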
``DagFileProcessorProcess`` has the following steps:
1. Process file: The entire process must complete within :ref:`dag_file_processor_timeout<config:dag_processor__dag_file_processor_timeout>`
-2. The DAG files are loaded as Python module: Must complete within :ref:`dagbag_import_timeout<config:core__dagbag_import_timeout>`
-3. Process modules: Find DAG objects within Python module
-4. Return DagBag: Provide the ``DagFileProcessorManager`` a list of the discovered DAG objects
+2. The dag files are loaded as Python module: Must complete within :ref:`dagbag_import_timeout<config:core__dagbag_import_timeout>`
+3. Process modules: Find dag objects within Python module
+4. Return DagBag: Provide the ``DagFileProcessorManager`` a list of the discovered dag objects
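Step 3 of the processor side (finding dag objects in a freshly imported module) amounts to scanning the module's top-level attributes. A hedged sketch, with a hypothetical ``Dag`` class standing in for Airflow's real ``DAG`` and no claim to match the actual DagBag code:

```python
import types


class Dag:
    """Stand-in for Airflow's DAG class; purely illustrative."""

    def __init__(self, dag_id):
        self.dag_id = dag_id


def collect_dags_from_module(module, dag_type=Dag):
    """Sketch of processor step 3: keep every top-level attribute of the
    loaded module that is a dag object, keyed by its dag_id."""
    return {
        obj.dag_id: obj
        for obj in vars(module).values()
        if isinstance(obj, dag_type)
    }


# Usage: simulate a dag file that was just imported as a module.
module = types.ModuleType("example_dag_file")
module.my_dag = Dag("my_dag")
module.helper = 42  # non-dag top-level names are ignored
found = collect_dags_from_module(module)
```

The mapping returned here plays the role of the DagBag handed back to the ``DagFileProcessorManager`` in step 4.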
-Fine-tuning your DAG processor performance
+Fine-tuning your dag processor performance
Review Comment:
Yep, I guess I misunderstood. Just updated. Thanks!
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]