[ 
https://issues.apache.org/jira/browse/AIRFLOW-5151?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

t oo updated AIRFLOW-5151:
--------------------------
    Description: 
h3. Trigger Rules documentation in Airflow is a bit light

4 scenarios are not covered, and there are no recommendations on how to achieve them:

scenario 1: Someone wants a DAG with a single task, but this task is a 'nice to 
have', so any failure of the task should still result in a DagRun state of 
'success'
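
For reference, the only workaround today seems to be swallowing the failure inside the task itself. A minimal sketch against 1.10.x with a PythonOperator (do_real_work is a hypothetical stand-in for the actual task logic):

{code:python}
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator


def nice_to_have_callable():
    try:
        do_real_work()  # hypothetical stand-in for the real task logic
    except Exception:
        pass  # swallow the failure: the task, and thus the DagRun, stay 'success'


with DAG("single_task_dag", start_date=datetime(2019, 8, 1),
         schedule_interval="@daily") as dag:
    t1 = PythonOperator(task_id="t1", python_callable=nice_to_have_callable)
{code}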

scenario 2: Someone wants a DAG with 3 tasks, but the 1st task is a 'nice to 
have', so any failure of that task should still result in a DagRun state of 
'success', AND the 2nd/3rd tasks should still run as normal (if the 2nd/3rd 
tasks fail, the DagRun should fail)
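
For this layout a trigger rule may already be enough: give t2 trigger_rule='all_done' so it runs regardless of what happened to t1. Since (in 1.10.x) the DagRun state is derived from the leaf tasks, a failure in t1 alone still leaves the DagRun 'success', while a t2 or t3 failure still fails it. A sketch:

{code:python}
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

with DAG("scenario_2", start_date=datetime(2019, 8, 1),
         schedule_interval="@daily") as dag:
    t1 = BashOperator(task_id="t1", bash_command="exit 1")  # stand-in: a task that may fail
    t2 = BashOperator(task_id="t2", bash_command="echo t2",
                      trigger_rule="all_done")  # runs whatever t1's outcome was
    t3 = BashOperator(task_id="t3", bash_command="echo t3")  # leaf: decides DagRun state
    t1 >> t2 >> t3
{code}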

scenario 3: Someone wants a DAG with 3 tasks, but the 2nd task is a 'nice to 
have', so any failure of that task should still result in a DagRun state of 
'success', AND the 1st/3rd tasks should still run (if the 1st/3rd tasks fail, 
the DagRun should fail)
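
Here a trigger rule on t3 is not enough: trigger_rule='all_done' on t3 would also let t3 run, and the DagRun go green, when t1 fails, which is not what this scenario wants. The workaround sketch is instead to make t2 itself swallow its failure (as in scenario 1) and keep default trigger rules everywhere:

{code:python}
from datetime import datetime

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.operators.python_operator import PythonOperator


def best_effort_t2():
    try:
        do_t2_work()  # hypothetical stand-in for t2's real logic
    except Exception:
        pass  # t2 always ends 'success', so t3 still runs and t1 still gates the run


with DAG("scenario_3", start_date=datetime(2019, 8, 1),
         schedule_interval="@daily") as dag:
    t1 = BashOperator(task_id="t1", bash_command="echo t1")
    t2 = PythonOperator(task_id="t2", python_callable=best_effort_t2)
    t3 = BashOperator(task_id="t3", bash_command="echo t3")
    t1 >> t2 >> t3
{code}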

scenario 4: Someone wants a DAG with 3 tasks, but the 3rd task is a 'nice to 
have', so any failure of that task should still result in a DagRun state of 
'success', AND the 1st/2nd tasks should still run (if the 1st/2nd tasks fail, 
the DagRun should fail)
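
t3 is a leaf task here, and (in 1.10.x) a failed leaf fails the DagRun. Appending a DummyOperator with trigger_rule='all_done' does not help either, because it would also go green when t1/t2 fail. So once again the task itself has to swallow its failure. One generic, hypothetical way to do that for any operator class (this helper is not part of Airflow; it is exactly the kind of switch this ticket asks for):

{code:python}
def best_effort(operator_cls):
    """Hypothetical helper: subclass operator_cls so that execute() never raises."""
    class BestEffort(operator_cls):
        def execute(self, context):
            try:
                return super(BestEffort, self).execute(context)
            except Exception:
                self.log.warning("Ignoring failure of nice-to-have task %s",
                                 self.task_id)
    return BestEffort
{code}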

 

notes:

a) callback triggers are too complex a workaround for what is not an uncommon use case

b) with the Python/Bash operators you can simply not check return codes, but 
other custom operators always report success/fail, unless there is some way to 
make them always report success while, unlike a DummyOperator, still actually 
performing their task (e.g. SFTPOperator, etc.)?
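
For example, the hypothetical best_effort() helper sketched under scenario 4 could wrap such an operator so it still performs its transfer but never reports failure (1.10.x contrib import; the connection id and file paths below are made-up placeholders):

{code:python}
from airflow.contrib.operators.sftp_operator import SFTPOperator

BestEffortSFTP = best_effort(SFTPOperator)  # helper sketched above, not part of Airflow

# Instantiated inside a DAG context, as in the earlier sketches:
t3 = BestEffortSFTP(
    task_id="upload_report",
    ssh_conn_id="my_sftp_conn",            # placeholder connection id
    local_filepath="/tmp/report.csv",      # placeholder paths
    remote_filepath="/upload/report.csv",
    operation="put",                       # still performs the transfer,
)                                          # but never fails the DagRun
{code}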

 

  was:h3. Trigger Rules documentation in airflow is a bit light


> Simple boolean variable for a DAGRun to ignore failures for a certain task
> --------------------------------------------------------------------------
>
>                 Key: AIRFLOW-5151
>                 URL: https://issues.apache.org/jira/browse/AIRFLOW-5151
>             Project: Apache Airflow
>          Issue Type: Improvement
>          Components: DAG, DagRun, scheduler
>    Affects Versions: 1.10.4
>            Reporter: t oo
>            Priority: Minor
>             Fix For: 2.0.0
>


