Hi team,

We wanted to enable dag_run timeouts on our DAGs, but after going through 
how dagrun_timeout behaves, we learned that it only takes effect when both 
of the below conditions are met:

1) The dag_run must be a scheduled one, i.e. not manually created
2) max_active_runs must be configured on the DAG
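For reference, a minimal DAG satisfying both conditions might look like the 
sketch below (the dag_id, schedule, and timeout values are placeholders, not 
anything from our actual setup):

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator

# Hypothetical DAG meeting both preconditions: it has a schedule
# (so runs are scheduler-created, not manual) and max_active_runs set.
dag = DAG(
    dag_id="example_with_dagrun_timeout",   # placeholder id
    start_date=datetime(2019, 1, 1),
    schedule_interval="@daily",             # scheduled, not manually triggered
    max_active_runs=1,                      # required for the timeout check
    dagrun_timeout=timedelta(hours=1),      # run failed if still active after 1h
)

noop = DummyOperator(task_id="noop", dag=dag)
```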

When it works:
During dag_run creation, if the count of existing active dag_runs equals the 
configured max_active_runs and a previous run has been running longer than the 
configured timeout, that run is marked as 
failed (https://github.com/apache/airflow/blob/master/airflow/jobs.py#L784).
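In simplified form, the check amounts to something like the sketch below. This 
is our own illustration of the logic, not the actual Airflow code at that link; 
the function name and the (start_date, state) representation are made up:

```python
from datetime import datetime, timedelta


def check_dagrun_timeout(active_runs, max_active_runs, dagrun_timeout, now):
    """Sketch of the scheduler's timeout check.

    active_runs: list of (start_date, state) tuples for a DAG's runs.
    Only when the number of running runs has reached max_active_runs does
    the scheduler scan them and fail any that exceeded dagrun_timeout.
    Returns the list with updated states.
    """
    running = [r for r in active_runs if r[1] == "running"]
    if len(running) < max_active_runs:
        # Below the cap: the timeout check is skipped entirely, which is
        # why dagrun_timeout appears to do nothing without max_active_runs.
        return active_runs

    result = []
    for start_date, state in active_runs:
        if state == "running" and now - start_date > dagrun_timeout:
            result.append((start_date, "failed"))
        else:
            result.append((start_date, state))
    return result
```

This also shows why an existing run is only timed out when a new run is being 
created: the check runs as a side effect of dag_run creation, not on a clock.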

How can we achieve the following:

1) Timeouts on manually created dag_runs
2) Enabling timeouts on existing dag_runs without requiring new dag_runs to 
be triggered
3) Even though the dag_run is marked as failed, any running task keeps 
running until it reaches a terminal state; can the tasks be stopped as well?

Workaround:
We also explored execution_timeout at the individual task level, in 
combination with an appropriate trigger rule on downstream tasks; this works 
perfectly for us.
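Roughly, the workaround looks like the sketch below (dag_id, task ids, 
commands, and timeout values are all placeholders, not our real DAG):

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.utils.trigger_rule import TriggerRule

dag = DAG(
    dag_id="example_task_level_timeouts",   # placeholder id
    start_date=datetime(2019, 1, 1),
    schedule_interval=None,                 # also works for manual runs
)

# Cap each task individually instead of relying on dagrun_timeout;
# execution_timeout works regardless of how the dag_run was created,
# and the task is actually killed when it fires.
long_task = BashOperator(
    task_id="long_task",
    bash_command="sleep 30",
    execution_timeout=timedelta(minutes=10),
    dag=dag,
)

# Downstream task runs whether long_task succeeded, failed, or timed
# out, so the run still finishes cleanly.
cleanup = BashOperator(
    task_id="cleanup",
    bash_command="echo done",
    trigger_rule=TriggerRule.ALL_DONE,
    dag=dag,
)

long_task >> cleanup
```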

Regards,
Vardan Gupta
