This is an automated email from the ASF dual-hosted git repository.

potiuk pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/airflow.git


The following commit(s) were added to refs/heads/main by this push:
     new 0a69ba1211 note that task instance, dag and lifecycle listeners are non-experimental (#36376)
0a69ba1211 is described below

commit 0a69ba12115679f319516c93ba58f54fb2701c67
Author: Maciej Obuchowski <[email protected]>
AuthorDate: Fri Dec 22 23:45:46 2023 +0100

    note that task instance, dag and lifecycle listeners are non-experimental (#36376)
    
    Update newsfragments/1234.significant.rst
    
    Signed-off-by: Maciej Obuchowski <[email protected]>
    Co-authored-by: Elad Kalif <[email protected]>
---
 .../administration-and-deployment/listeners.rst        |  8 ++++----
 newsfragments/36376.significant.rst                    | 18 ++++++++++++++++++
 2 files changed, 22 insertions(+), 4 deletions(-)

diff --git a/docs/apache-airflow/administration-and-deployment/listeners.rst b/docs/apache-airflow/administration-and-deployment/listeners.rst
index 0672e07779..19f5d27646 100644
--- a/docs/apache-airflow/administration-and-deployment/listeners.rst
+++ b/docs/apache-airflow/administration-and-deployment/listeners.rst
@@ -58,6 +58,9 @@ Dataset Events
 
 Dataset events occur when Dataset management operations are run.
 
+|experimental|
+
+
 Usage
 -----
 
@@ -68,9 +71,6 @@ To create a listener:
 
 Airflow defines the specification as `hookspec <https://github.com/apache/airflow/tree/main/airflow/listeners/spec>`__. Your implementation must accept the same named parameters as defined in hookspec. If you don't use the same parameters as hookspec, Pluggy throws an error when you try to use your plugin. But you don't need to implement every method. Many listeners only implement one method, or a subset of methods.
 
-To include the listener in your Airflow installation, include it as a part of an :doc:`Airflow Plugin </authoring-and-scheduling/plugins>`
+To include the listener in your Airflow installation, include it as a part of an :doc:`Airflow Plugin </authoring-and-scheduling/plugins>`.
 
 Listener API is meant to be called across all DAGs and all operators. You can't listen to events generated by specific DAGs. For that behavior, try methods like ``on_success_callback`` and ``pre_execute``. These provide callbacks for particular DAG authors or operator creators. The logs and ``print()`` calls will be handled as part of the listeners.
-
-
-|experimental|
diff --git a/newsfragments/36376.significant.rst b/newsfragments/36376.significant.rst
new file mode 100644
index 0000000000..6c37a8a8eb
--- /dev/null
+++ b/newsfragments/36376.significant.rst
@@ -0,0 +1,18 @@
+The following Listener API methods are considered stable and can be used in production systems (they were an experimental feature in older Airflow versions):
+
+Lifecycle events:
+
+- ``on_starting``
+- ``before_stopping``
+
+DagRun State Change Events:
+
+- ``on_dag_run_running``
+- ``on_dag_run_success``
+- ``on_dag_run_failed``
+
+TaskInstance State Change Events:
+
+- ``on_task_instance_running``
+- ``on_task_instance_success``
+- ``on_task_instance_failed``
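
The diff above notes that Pluggy requires listener implementations to accept the hookspec's named parameters, while allowing a listener to implement only a subset of methods. The following stdlib-only sketch illustrates that dispatch pattern; the ``ListenerManager`` class and its ``notify`` method are simplified stand-ins for illustration, not Airflow or Pluggy API, and only the hook name ``on_task_instance_success`` is taken from the listener spec:

```python
# Hedged sketch: mimics how a plugin manager (such as Pluggy, which Airflow
# uses) dispatches listener hooks by name. Illustrative only, not Airflow code.
import inspect


class ListenerManager:
    """Calls a named hook on every registered listener that implements it."""

    def __init__(self):
        self.listeners = []

    def register(self, listener):
        self.listeners.append(listener)

    def notify(self, hook_name, **kwargs):
        results = []
        for listener in self.listeners:
            hook = getattr(listener, hook_name, None)
            if hook is None:
                continue  # listeners may implement only a subset of methods
            # Pluggy-style check: the implementation must accept the spec's
            # named parameters, otherwise an error is raised.
            params = inspect.signature(hook).parameters
            unknown = set(kwargs) - set(params)
            if unknown:
                raise TypeError(f"{hook_name} got unexpected params: {unknown}")
            results.append(hook(**kwargs))
        return results


class AuditListener:
    # Implements only one hook from the (assumed) spec; the parameter names
    # here are illustrative stand-ins for the spec's named parameters.
    def on_task_instance_success(self, previous_state, task_instance):
        return f"{task_instance} succeeded (was {previous_state})"


manager = ListenerManager()
manager.register(AuditListener())
print(manager.notify("on_task_instance_success",
                     previous_state="running", task_instance="my_task"))
```

Calling ``notify`` with a hook the listener does not implement simply skips it, while passing a parameter name absent from the implementation's signature raises an error, mirroring the Pluggy behavior the documentation describes.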
