The different processes, potentially running on different machines, need to
be configured so that they all see the same DAGS_FOLDER. At Airbnb we
replicate our DAGs definition repository using Chef, and all the machines
share an identical `airflow.cfg` that points to the DAGS_FOLDER.

Pretty much every company has its own way to configure boxes, sync git
repos, and wrap executables into services, but the important thing is that
every Airflow process that is part of the same Airflow cluster should be
configured identically (point to the same metadata database, point to the
same DAGS_FOLDER, have the same executor configuration, ...)
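For illustration, here is a minimal sketch of the `airflow.cfg` settings
that have to match across every machine in the cluster; the paths and
connection string below are made-up placeholders, not our actual values:

    [core]
    # Every machine must see the same DAG definitions at this path
    # (hypothetical path; replicate the repo here with Chef, git sync, etc.)
    dags_folder = /var/lib/airflow/dags

    # Every process must talk to the same metadata database
    # (hypothetical connection string)
    sql_alchemy_conn = mysql://airflow:airflow@db-host:3306/airflow

    # Every process must agree on the executor
    executor = CeleryExecutor

If a webserver reads a different dags_folder, a DAG can be marked active
in the metadata database while being absent from that webserver's DagBag,
which is exactly the greyed-out behavior described below.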

On Thu, Jan 5, 2017 at 7:32 AM, Michael Gong <go...@hotmail.com> wrote:

> Hi,
>
> Here is the case:
>
> I have 2 separate Airflow instances: 2 DAG instances, 2 Airflow servers,
> 2 Airflow webservers.
>
> But they share the same MySQL db.
>
> From the 1st Airflow webserver, I can see the DAG instances from the 2nd
> Airflow instance.
>
> But they are greyed out, with the message:
>
> "This DAG isn't available in the web server's DagBag object. It shows up
> in this list because the scheduler marked it as active in the metadata
> database."
>
> That seems to make sense somehow.
>
> My question is: how do I add the 2nd DAG instance to the 1st Airflow
> webserver's DagBag object?
>
> The goal is that, from the 1st Airflow webserver, I can see both DAG
> instances as active.
>
> Any advice is welcome.
>
> Thanks.
>
> Michael
