ferruzzi commented on code in PR #55088:
URL: https://github.com/apache/airflow/pull/55088#discussion_r2350347083


##########
airflow-core/src/airflow/models/deadline.py:
##########
@@ -366,6 +367,76 @@ def _evaluate_with(self, *, session: Session, **kwargs: Any) -> datetime:
 
             return _fetch_from_db(DagRun.queued_at, session=session, **kwargs)
 
+    @dataclass
+    class AverageRuntimeDeadline(BaseDeadlineReference):
+        """A deadline that calculates the average runtime from past DAG runs."""
+
+        DEFAULT_LIMIT = 10
+        limit: int
+        min_runs: int | None = None
+        required_kwargs = {"dag_id"}
+
+        def __post_init__(self):
+            if self.min_runs is None:
+                self.min_runs = self.limit
+            if self.min_runs < 1:
+                raise ValueError("min_runs must be at least 1")
+
+        @provide_session
+        def _evaluate_with(self, *, session: Session, **kwargs: Any) -> datetime | None:
+            from airflow.models import DagRun
+
+            dag_id = kwargs["dag_id"]
+
+            # Query for completed DAG runs with both start and end dates
+            # Order by logical_date descending to get most recent runs first
+            query = (
+                select(func.extract("epoch", DagRun.end_date - DagRun.start_date))
+                .filter(DagRun.dag_id == dag_id, DagRun.start_date.isnot(None), DagRun.end_date.isnot(None))
+                .order_by(DagRun.logical_date.desc())
+            )
+
+            # Apply limit
+            query = query.limit(self.limit)
+
+            # Get all durations and calculate average
+            durations = session.execute(query).scalars().all()
+
+            if len(durations) < cast("int", self.min_runs):
+                logger.info(
+                    "Only %d completed DAG runs found for dag_id: %s (need %d), skipping deadline creation",
+                    len(durations),
+                    dag_id,
+                    self.min_runs,
+                )
+                return None
+            avg_seconds = sum(durations) / len(durations)
+            logger.info(
+                "Average runtime for dag_id %s (from %d runs): %.2f seconds",
+                dag_id,
+                len(durations),
+                avg_seconds,
+            )
+            return timezone.utcnow() + timedelta(seconds=avg_seconds)

Review Comment:
   `DeadlineReference.AVERAGE_RUNTIME()` doesn't do anything with `interval` right now.  Did you intend to add the provided interval to the average as a buffer?  The docs and tests seem to indicate that was the intent, but I don't see where it's happening.  That also makes me wonder how those unit tests are passing.
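   For what it's worth, folding the buffer in could look something like this. A minimal, Airflow-free sketch (the helper name is hypothetical; `durations_s` stands in for the epoch durations the query returns, and `interval` is assumed to be a `timedelta`):

```python
from datetime import datetime, timedelta, timezone


def average_runtime_deadline(durations_s, interval, min_runs, now=None):
    """Return now + mean(durations_s) + interval, or None if too few runs.

    Illustrative only: `interval` is the buffer the docs describe, added on
    top of the historical average rather than silently dropped.
    """
    if len(durations_s) < min_runs:
        return None  # mirrors the PR's "skipping deadline creation" branch
    now = now or datetime.now(timezone.utc)
    avg_seconds = sum(durations_s) / len(durations_s)
    return now + timedelta(seconds=avg_seconds) + interval
```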



##########
airflow-core/src/airflow/models/deadline.py:
##########
@@ -366,6 +367,76 @@ def _evaluate_with(self, *, session: Session, **kwargs: Any) -> datetime:
 
             return _fetch_from_db(DagRun.queued_at, session=session, **kwargs)
 
+    @dataclass
+    class AverageRuntimeDeadline(BaseDeadlineReference):
+        """A deadline that calculates the average runtime from past DAG runs."""
+
+        DEFAULT_LIMIT = 10
+        limit: int

Review Comment:
   Feel free to just resolve, but I'd suggest `max_runs` instead of `limit` to match your new parameter name.
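   A sketch of what the rename could look like (names are only a suggestion; validation kept from the PR, stripped of the Airflow base class so it stands alone):

```python
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class AverageRuntimeDeadline:
    """Illustrative only: `max_runs` replacing `limit` for symmetry with `min_runs`."""

    DEFAULT_MAX_RUNS = 10  # plain class attribute, not a dataclass field (was DEFAULT_LIMIT)

    max_runs: int = DEFAULT_MAX_RUNS
    min_runs: int | None = None

    def __post_init__(self):
        if self.min_runs is None:
            self.min_runs = self.max_runs
        if self.min_runs < 1:
            raise ValueError("min_runs must be at least 1")
```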



##########
airflow-core/tests/unit/models/test_deadline.py:
##########
@@ -375,11 +377,140 @@ def test_deadline_database_integration(self, reference, expected_column, session
             if expected_column is not None:
                 result = reference.evaluate_with(session=session, interval=interval, **conditions)
                 mock_fetch.assert_called_once_with(expected_column, session=session, **conditions)
+            elif reference == DeadlineReference.AVERAGE_RUNTIME():
+                with mock.patch("airflow._shared.timezones.timezone.utcnow") as mock_utcnow:
+                    mock_utcnow.return_value = DEFAULT_DATE
+                    # No DAG runs exist, so it should use 24-hour default
+                    result = reference.evaluate_with(session=session, interval=interval, dag_id=DAG_ID)
+                    mock_fetch.assert_not_called()
+                    # Should return None when no DAG runs exist
+                    assert result is None

Review Comment:
   This needs to be updated now, right? At least the comment about the 24-hour default.



##########
airflow-core/docs/howto/deadline-alerts.rst:
##########
@@ -97,6 +97,55 @@ Airflow provides several built-in reference points that you can use with Deadlin
 ``DeadlineReference.FIXED_DATETIME``
     Specifies a fixed point in time. Useful when Dags must complete by a specific time.
 
+``DeadlineReference.AVERAGE_RUNTIME``
+    Calculates deadlines based on the average runtime of previous DAG runs. This reference
+    analyzes historical execution data to predict when the current run should complete.
+    The deadline is set to the current time plus the calculated average runtime plus the interval.
+    If insufficient historical data exists, no deadline is created.
+
+    Parameters:
+        * ``limit`` (int, optional): Maximum number of recent DAG runs to analyze. Defaults to 10.
+        * ``min_runs`` (int, optional): Minimum number of completed runs required to calculate the average. Defaults to the same value as ``limit``.
+
+    Example usage:
+
+    .. code-block:: python
+
+        # Use default settings (analyze up to 10 runs, require 10 runs)
+        DeadlineReference.AVERAGE_RUNTIME()
+
+        # Analyze up to 20 runs but calculate with minimum 5 runs
+        DeadlineReference.AVERAGE_RUNTIME(limit=20, min_runs=5)
+
+        # Strict: require exactly 15 runs to calculate
+        DeadlineReference.AVERAGE_RUNTIME(limit=15, min_runs=15)
+
+Here's an example using average runtime:
+
+.. code-block:: python
+
+    with DAG(
+        dag_id="average_runtime_deadline",
+        deadline=DeadlineAlert(
+            reference=DeadlineReference.AVERAGE_RUNTIME(limit=15, min_runs=5),
+            interval=timedelta(minutes=30),  # Alert if 30 minutes past expected completion
+            callback=AsyncCallback(
+                SlackWebhookNotifier,
+                kwargs={"text": "🚨 DAG {{ dag_run.dag_id }} is running longer than expected!"},
+            ),
+        ),
+    ):
+        EmptyOperator(task_id="data_processing")
+
+The timeline for this example would look like this:
+
+::
+
+    |------|----------|---------|------------|--------|
+         Queued     Start    Expected    Deadline
+         09:00      09:05     09:35       10:05
+                              (avg+30min)
+

Review Comment:
   Could just be me, but this timeline is confusing me.  Maybe consider 
something like this??
   
   
   ```
    If the calculated historical average was 30 minutes, the timeline for this example would look like this:
   
   ::
   
       |------|----------|--------------|--------------|--------|
            Queued     Start            |           Deadline
            09:00      09:05          09:35          10:05
                         |              |              |
                         |--- Average --|-- Interval --|
                              (30 min)      (30 min)
   
   ```
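   If it helps, the arithmetic behind the suggested timeline (assuming a 30-minute historical average plus the 30-minute interval from the example) can be sanity-checked:

```python
from datetime import datetime, timedelta

start = datetime(2025, 1, 1, 9, 5)           # run starts at 09:05
expected = start + timedelta(minutes=30)     # 09:35, the historical average
deadline = expected + timedelta(minutes=30)  # 10:05, average plus interval buffer
```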



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
