You can write a function that returns a DAG object. Once you call that
function and assign the result to a variable at the top level of your DAG
file, Airflow will pick it up and schedule it.

Example:

from datetime import datetime
from airflow import DAG

def setup_dag(dag_id, schedule):
    # start_date is required; any fixed date in the past works
    dag = DAG(dag_id=dag_id, schedule_interval=schedule,
              start_date=datetime(2017, 1, 1))

    # ... add operators and anything else based on the inputs ...

    return dag

some_dag = setup_dag('great_dag', '0 * * * *')
some_other_dag = setup_dag('good_dag', '0 0 * * *')
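
If you also want the schedule itself to vary per environment, this combines
nicely with Joy's Variable.get suggestion below. A minimal sketch, assuming
you have created an Airflow Variable named 'environment' in each deployment
(that Variable name and the schedule values are just placeholders):

from airflow.models import Variable

# Hypothetical per-environment schedules.
SCHEDULES = {
    'dev': '@daily',
    'prod': '0 * * * *',
}

def get_schedule():
    # Look up which environment this Airflow instance is running in;
    # fall back to 'dev' if the Variable has not been set.
    env = Variable.get('environment', default_var='dev')
    return SCHEDULES[env]

some_dag = setup_dag('great_dag', get_schedule())

Then the same DAG file can be deployed unchanged to every environment.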


Hakan

On Wed, Nov 1, 2017 at 12:12 PM, Michael Crawford <
[email protected]> wrote:

> Right, that would be the case if I were running multiple Airflows.
>
> In this case I have all the ETLs running in the same Airflow, so I have a
> separate DAG for each.
>
> Like, if this were object-oriented, I would just make a base DAG and
> then extend it to make all of the different env ones.
>
>
>
> > On Nov 1, 2017, at 2:45 PM, Joy Gao <[email protected]> wrote:
> >
> > Hi Michael,
> >
> > You could leverage the Variable feature in Airflow: start by setting a
> > Variable with a different value in each env, and then in your DAG file,
> > instead of hard-coding the schedule, call a method that looks up which
> > environment it is (using Variable.get) and returns the desired schedule.
> >
> > Hope this helps!
> >
> > On Wed, Nov 1, 2017 at 11:10 AM, Michael Crawford <
> > [email protected]> wrote:
> >
> >> Hi All,
> >>
> >> Is there a best practice for registering similar DAGs which differ
> >> only in a few small variables?
> >>
> >> For instance, say I have a certain ETL that I want to run in several
> >> different environments on different schedules.
> >>
> >> All of the DAGs would be essentially the same, just with a few
> >> different parameters.
> >>
> >> It doesn’t seem like duplicating the DAG code over and over is the right
> >> way to do this.
> >>
> >> Thanks,
> >> Mike
> >>
>
>
