Hello Everyone,


We are currently evaluating a few solutions for our next network automation
& orchestration platform (Airflow, StackStorm, BPMN-based solutions, a
hand-rolled Django/Celery setup, etc.).

Airflow is one of the candidates.



The typical jobs would be, for example (a rough DAG sketch follows the list):

- Get a few parameters about the type of device to deploy

- Generate the configuration with an external tool and push it to various
internal systems

- Wait for the device to boot and pick up the configuration

- Validate the device's health (via SSH or an API call)

- Notify on Slack (and via other API calls) that the provisioning is complete
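
To give an idea, here is the rough DAG I sketched while testing (all task
bodies are placeholders, the device names are made up, and I'm assuming
Airflow 2.x-style imports):

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def get_device_parameters(**context):
    # Placeholder: would fetch the device type/model from our inventory
    return {"device": "edge-router-01", "model": "acme-9000"}

def generate_and_push_config(**context):
    # Placeholder: would call the external config generator and push the
    # result to the internal provisioning systems
    params = context["ti"].xcom_pull(task_ids="get_device_parameters")
    print("pushing config for %s" % params["device"])

def wait_for_boot(**context):
    # Placeholder: would poll until the device is up and configured
    pass

def validate_device_health(**context):
    # Placeholder: SSH or API call to the device to check its state
    pass

def notify_slack(**context):
    # Placeholder: webhook call announcing that provisioning is complete
    pass

with DAG(
    dag_id="provision_device",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,  # triggered on demand, never scheduled
    catchup=False,
) as dag:
    get_params = PythonOperator(task_id="get_device_parameters",
                                python_callable=get_device_parameters)
    push_config = PythonOperator(task_id="generate_and_push_config",
                                 python_callable=generate_and_push_config)
    wait = PythonOperator(task_id="wait_for_boot",
                          python_callable=wait_for_boot)
    validate = PythonOperator(task_id="validate_device_health",
                              python_callable=validate_device_health)
    notify = PythonOperator(task_id="notify_slack",
                            python_callable=notify_slack)

    get_params >> push_config >> wait >> validate >> notify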



I started testing it but have a few questions I could not find answers to:



- I see that it is mostly used for data processing in general; is it a good
fit for something else, like our case?



- It's not clear how to handle dynamic DAGs: for example, if a first task
gets a list of devices, can a second task be run (in parallel if possible)
for each device? I have read many times that Airflow is not made for this
type of dynamic workflow; is this still the case? (A plain-Python sketch of
the pattern we want follows.)
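
The fan-out we are after, reduced to plain Python (the helper names and
device list are made up for illustration):

from concurrent.futures import ThreadPoolExecutor

def get_devices():
    # First task: would query our inventory system at run time
    return ["router-01", "router-02", "switch-01"]

def provision(device):
    # Second task: would configure and validate a single device
    print("provisioning %s" % device)

devices = get_devices()  # only known when the job runs
with ThreadPoolExecutor() as pool:
    # One provisioning run per device, in parallel
    list(pool.map(provision, devices))

The question is whether an Airflow DAG can express the same thing when the
device list is only known at run time.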



- We will probably not use the scheduling features, but rather run the tasks
on demand (after receiving API calls). Is that a reasonable way to use
Airflow? (A sketch of what we have in mind follows.)
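
What we have in mind, assuming Airflow 2.x's stable REST API with basic
auth enabled (host, credentials, and the conf payload are made up):

import requests

# Trigger one provisioning run for a specific device
resp = requests.post(
    "http://airflow.example.com/api/v1/dags/provision_device/dagRuns",
    auth=("api_user", "api_password"),
    json={"conf": {"device": "edge-router-01", "model": "acme-9000"}},
)
resp.raise_for_status()
print(resp.json()["dag_run_id"])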



- Is there any equivalent of the sensor feature found in StackStorm, which
is also a tool we are evaluating ( https://docs.stackstorm.com/sensors.html )?
(A loop illustrating what I mean by "sensor" follows.)
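
The behaviour I mean, reduced to a polling loop (the function names are
hypothetical):

import time

def new_device_appeared():
    # Would watch our inventory or listen on a message bus
    return False

def trigger_provisioning_workflow():
    # Would dispatch the provisioning job with the device's parameters
    pass

while True:
    if new_device_appeared():
        trigger_provisioning_workflow()
    time.sleep(30)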



- Is there a way to validate/enforce job parameters, or does this have to
be done manually by checking the dict content (as sketched below)?
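
The manual approach as I understand it today, checking the run conf at the
top of the first task (the key names are made up):

REQUIRED_KEYS = {"device", "model"}

def get_device_parameters(**context):
    # dag_run.conf carries the parameters passed when triggering the run
    conf = context["dag_run"].conf or {}
    missing = REQUIRED_KEYS - conf.keys()
    if missing:
        raise ValueError("missing run parameters: %s" % sorted(missing))
    return conf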



Thanks