Hi,

For some of our DAGs [where we clear and re-import the staging tables], having 
wait_for_downstream on the tasks is very important.
However, we have a BranchPythonOperator for the happy and error paths, so if the 
flow completes successfully, the tasks of the error path are marked as 
skipped.
This doesn't seem to work well with wait_for_downstream, since Airflow only 
checks for task instances in the SUCCESS state.
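To illustrate what I mean, here is a minimal pure-Python sketch (not Airflow internals, just our understanding of the check) of why a skipped error-path task blocks the next run when only SUCCESS counts as "done":

```python
# Sketch (NOT Airflow code) of the wait_for_downstream check as we understand
# it: the next run is blocked until every downstream task instance of the
# previous run is in an acceptable state. Task and state names are illustrative.

SUCCESS, SKIPPED, FAILED = "success", "skipped", "failed"

def downstream_done(prev_run_states, allowed=frozenset({SUCCESS})):
    """True if every downstream task of the previous run is in an allowed state."""
    return all(state in allowed for state in prev_run_states.values())

# Previous run took the happy path, so the error-path task was skipped:
prev = {"load_staging": SUCCESS, "happy_path": SUCCESS, "error_path": SKIPPED}

# Counting only SUCCESS: the skipped task keeps the next run blocked.
print(downstream_done(prev))                      # False

# If SKIPPED also counted as done, the next run would be unblocked:
print(downstream_done(prev, {SUCCESS, SKIPPED}))  # True
```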
Is this 'by design'?
How can we handle sequential execution of the DAGs then? Should we completely 
re-design the workflow, or is there a way to do that with Airflow configuration 
settings?

Thanks,
Greg
