Hi All,

In our workflows we trigger big data jobs which run from a few hours to a few
days. Currently our Airflow operator submits the job and keeps polling its
status; depending on that status, the next task in the workflow is triggered
by the Airflow scheduler.
So currently the operator is not doing any useful work, but it is occupying
one worker slot.
We are exploring whether we can do the following:
 -> The operator submits the big data job and marks itself success
 -> Once that job finishes, it sends a notification, either through a callback
or through some event
 -> Based on the notification, the dependent task in the workflow is triggered
and starts running.
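
The steps above could be sketched roughly like this in plain Python. All
names here are illustrative stand-ins, not real Airflow APIs; it just shows
the submit-and-forget operator plus a completion callback that triggers the
dependent task instead of polling:

```python
# Hypothetical sketch of the proposed pattern (illustrative names only,
# not Airflow APIs): the operator submits the job and returns immediately;
# a completion notification later triggers the dependent task.
from typing import Callable, Optional


class BigDataJob:
    """Stands in for an external big data job that finishes much later."""

    def __init__(self) -> None:
        self._on_done: Optional[Callable[[str], None]] = None

    def subscribe(self, callback: Callable[[str], None]) -> None:
        # Register who should be notified when the job completes.
        self._on_done = callback

    def finish(self, status: str) -> None:
        # Called by the external system when the job completes.
        if self._on_done is not None:
            self._on_done(status)


def submit_operator(job: BigDataJob, on_job_done: Callable[[str], None]) -> str:
    """Submit the job, register the notification, and mark itself success."""
    job.subscribe(on_job_done)
    return "success"  # returns immediately, freeing the worker slot


results = []


def dependent_task(status: str) -> None:
    """Triggered by the notification instead of by a polling operator."""
    results.append(f"downstream ran, upstream status={status}")


job = BigDataJob()
state = submit_operator(job, dependent_task)  # operator is done right away
job.finish("SUCCESS")  # external event, possibly hours or days later
```

The key point of the sketch is that `submit_operator` returns before the job
finishes, so nothing sits in a worker slot waiting; the dependency is driven
by the event.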

We might need to introduce a new operator state like "waiting".
Does this make sense?
Is there any way/workaround to achieve this?

Thanks,
Raman Gupta
   
