Hi All,
I'm with BandwidthX, a wireless tech company in San Diego.
We're trying to have one workflow tool that can be used for both business
process workflows and data pipelines, and I think Airflow can do that. I
also think it would make a good case study for Airflow, given that I see
people using it primarily for data pipelines.

We're starting with the business process workflows first, where a user
action can lead to the scheduling of one-time tasks, e.g. activate a
particular device at a particular date/time. Such a task may or may not
have dependencies. A subsequent user action could change the date/time of
the scheduled task or cancel it altogether.

I think Airflow can do that with *schedule_interval='@once'* and
*start_date=scheduled_date_time*, ideally with both passed in as
command-line parameters. I made it work by writing a Python script that
takes these parameters, generates a DAG definition file with the supplied
start_date, and drops that file into the DAGs folder. I also added a
dependent cleanup task to the generated DAG that deletes its own .py and
.pyc files once the real work has run.
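
In simplified form, the generator (calling it generate_dag.py here) looks
something like the sketch below. This is not my exact code: the DAG id, the
CLI flags, the activate_device callable, and the hard-coded DAGs folder path
are placeholders to illustrate the shape of the approach.

import argparse
import os
from datetime import datetime

# Template for a one-off DAG: schedule_interval='@once' makes the scheduler
# run it exactly once, as soon as start_date has passed.
DAG_TEMPLATE = '''\
import os
from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

dag = DAG(
    dag_id="{dag_id}",
    schedule_interval="@once",
    start_date=datetime({year}, {month}, {day}, {hour}, {minute}),
)

def activate_device():
    # Placeholder for the real device-activation call.
    print("activating device {device_id}")

activate = PythonOperator(
    task_id="activate_device",
    python_callable=activate_device,
    dag=dag,
)

def cleanup():
    # Delete this generated file (and its compiled .pyc) after the work ran.
    py = __file__[:-1] if __file__.endswith(".pyc") else __file__
    for path in (py, py + "c"):
        if os.path.exists(path):
            os.remove(path)

cleanup_task = PythonOperator(
    task_id="cleanup_generated_files",
    python_callable=cleanup,
    dag=dag,
)

# Run the cleanup only after the activation task has finished.
activate.set_downstream(cleanup_task)
'''

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("--dag-id", required=True)
    parser.add_argument("--device-id", required=True)
    parser.add_argument("--start-date", required=True,
                        help="e.g. 2017-06-01T09:30")
    args = parser.parse_args()

    start = datetime.strptime(args.start_date, "%Y-%m-%dT%H:%M")
    dags_folder = os.path.expanduser("~/airflow/dags")  # match airflow.cfg

    # Write the rendered one-off DAG into the DAGs folder for pickup.
    with open(os.path.join(dags_folder, args.dag_id + ".py"), "w") as f:
        f.write(DAG_TEMPLATE.format(
            dag_id=args.dag_id,
            device_id=args.device_id,
            year=start.year, month=start.month, day=start.day,
            hour=start.hour, minute=start.minute,
        ))

if __name__ == "__main__":
    main()

I then invoke it along these lines:

python generate_dag.py --dag-id activate_device_1234 --device-id 1234 --start-date 2017-06-01T09:30

Since the cleanup task is downstream of the activation task, the generated
file removes itself only after the real work has finished.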

Is there a better way to do this? Any resources you can point me to?

PS
I'm already part of https://gitter.im/apache/incubator-airflow.

Thanks

-- 
Dinesh Sharma
BandwidthX
[email protected]
(760) 203-4955 Ext. 121
