ferruzzi commented on code in PR #55088:
URL: https://github.com/apache/airflow/pull/55088#discussion_r2350261303


##########
airflow-core/src/airflow/models/deadline.py:
##########
@@ -366,6 +366,62 @@ def _evaluate_with(self, *, session: Session, **kwargs: Any) -> datetime:
 
             return _fetch_from_db(DagRun.queued_at, session=session, **kwargs)
 
+    @dataclass
+    class AverageRuntimeDeadline(BaseDeadlineReference):
+        """A deadline that calculates the average runtime from past DAG runs."""
+
+        DEFAULT_LIMIT = 10
+        limit: int
+        required_kwargs = {"dag_id"}
+
+        @provide_session
+        def _evaluate_with(self, *, session: Session, **kwargs: Any) -> datetime:
+            from airflow.models import DagRun
+
+            dag_id = kwargs["dag_id"]
+
+            # Query for completed DAG runs with both start and end dates
+            # Order by logical_date descending to get most recent runs first
+            query = (
+                select(func.extract("epoch", DagRun.end_date - DagRun.start_date))
+                .filter(DagRun.dag_id == dag_id, DagRun.start_date.isnot(None), DagRun.end_date.isnot(None))
+                .order_by(DagRun.logical_date.desc())
+            )
+
+            # Apply limit
+            query = query.limit(self.limit)
+
+            # Get all durations and calculate average
+            durations = session.execute(query).scalars().all()
+
+            if len(durations) < self.limit:
+                logger.warning(
+                    "In the AverageRuntimeDeadline: Only %d completed DAG runs found for dag_id: %s (need %d), using 48 hour default",
+                    len(durations),
+                    dag_id,
+                    self.limit,
+                )
+                avg_seconds = 48 * 3600  # 48 hours as default

Review Comment:
   Thinking this through,
   
   By adding some kind of "don't store this deadline" escape hatch, the deadline can never expire. The first run would never fail its deadline, the second run would use the first run's duration as the "average", and the average would build up from there over time.
   
   We would just have to make it clear that "no historical data means no failure".
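   As a rough sketch of that escape hatch (hypothetical helper name, not part of this PR; the real change would live in `AverageRuntimeDeadline._evaluate_with`): return `None` when there is no history at all, so no deadline gets stored, and average whatever runs do exist rather than falling back to a fixed 48-hour default.
   
   ```python
   from statistics import mean
   from typing import Optional
   
   
   def average_runtime_seconds(durations: list[float], limit: int) -> Optional[float]:
       """Average up to ``limit`` past run durations in seconds.
   
       Returns None when there is no history at all, signalling
       "don't store this deadline" -- no historical data means no failure.
       """
       if not durations:
           # Escape hatch: with zero completed runs, the first run
           # can never miss a deadline because none is created.
           return None
       # With partial history, average what we have instead of
       # substituting an arbitrary fixed fallback value.
       return mean(durations[:limit])
   ```
   
   The caller would then skip creating the Deadline row whenever the helper returns `None`, which is exactly the "no historical data means no failure" behaviour.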



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
