Hi Maulik,
I believe you can already achieve what you want using the existing
git-sync container. We use the git-sync container to pull DAGs from S3
for every task launched by the Kubernetes Executor, and it has been
working very well for us so far.
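For illustration, a minimal sketch of that init-container pattern using
the kubernetes Python client; the image tag, repo URL, and volume name
are placeholders, not our actual setup:

    from kubernetes.client import models as k8s

    # Init container that clones the DAG repo into a shared volume
    # before the task container starts; with GIT_SYNC_ONE_TIME it runs
    # once and exits instead of looping.
    git_sync = k8s.V1Container(
        name="git-sync",
        image="k8s.gcr.io/git-sync:v3.1.1",  # placeholder tag
        env=[
            k8s.V1EnvVar(name="GIT_SYNC_REPO", value="https://example.com/dags.git"),
            k8s.V1EnvVar(name="GIT_SYNC_ROOT", value="/dags"),
            k8s.V1EnvVar(name="GIT_SYNC_ONE_TIME", value="true"),
        ],
        volume_mounts=[k8s.V1VolumeMount(name="dags", mount_path="/dags")],
    )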
Thanks,
QP
On Fri, Oct 18, 2019 at 11:21 AM Maulik Soneji wrote:
Hello,
Cloud Composer stores the source code for your workflows (DAGs) and
their dependencies in specific folders in Cloud Storage, and uses Cloud
Storage FUSE to map those folders to the Airflow instances in the Cloud
Composer environment.
More info:
https://cloud.google.com/composer/docs/concepts/over
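For example, publishing a DAG is just an upload to the environment's
bucket, after which the FUSE mount surfaces it to Airflow. A minimal
sketch with the google-cloud-storage client; the bucket name is a
placeholder:

    from google.cloud import storage

    client = storage.Client()
    bucket = client.bucket("us-central1-my-env-bucket")  # placeholder Composer bucket
    # Objects under dags/ are what the FUSE mount exposes to Airflow.
    bucket.blob("dags/my_dag.py").upload_from_filename("my_dag.py")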
Hi Kamil,
Thank you very much for this suggestion. I will certainly try this out.
There is one more aspect to this: the Airflow deployments of the
webserver and scheduler also need to be updated with the latest DAGs.
Any thoughts on how we support updating the DAGs here?
Hello,
Why not just add an initialization container with a pod mutation? The
container can use the google/cloud-sdk image and run a gsutil -m rsync
... command. Then we would not have to write any code, and it would be
a solution in line with Kubernetes thinking, where one container
contains one responsibility.
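A minimal sketch of that idea, assuming the variant of
pod_mutation_hook that receives a kubernetes.client V1Pod; the bucket
name and paths are placeholders:

    # airflow_local_settings.py
    from kubernetes.client import models as k8s

    def pod_mutation_hook(pod):
        # Sync DAGs from GCS into an emptyDir volume before the task starts.
        sync = k8s.V1Container(
            name="dag-sync",
            image="google/cloud-sdk:slim",
            command=["gsutil", "-m", "rsync", "-r",
                     "gs://my-dag-bucket/dags", "/dags"],  # placeholder bucket
            volume_mounts=[k8s.V1VolumeMount(name="dags", mount_path="/dags")],
        )
        pod.spec.volumes = (pod.spec.volumes or []) + [
            k8s.V1Volume(name="dags", empty_dir=k8s.V1EmptyDirVolumeSource())
        ]
        pod.spec.init_containers = (pod.spec.init_containers or []) + [sync]
        # The task container reads the synced DAGs from the same volume.
        pod.spec.containers[0].volume_mounts = (
            pod.spec.containers[0].volume_mounts or []
        ) + [k8s.V1VolumeMount(name="dags", mount_path="/dags")]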
*[Proposal]*
Create a new *syncer* command to sync DAGs from any remote folder,
which will be used as the initContainer command in the
KubernetesExecutor.
It is just like the initdb command, but it copies DAGs from the remote
folder before running the DAG.
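Purely to illustrate the proposal, a rough sketch of what such a
command could look like; none of these names exist in Airflow today,
and a real implementation would use the provider hooks rather than
shelling out:

    import argparse
    import subprocess

    def syncer():
        parser = argparse.ArgumentParser(description="Sync DAGs from a remote folder")
        parser.add_argument("--source", required=True,
                            help="e.g. gs://bucket/dags or s3://bucket/dags")
        parser.add_argument("--dest", default="/usr/local/airflow/dags")
        args = parser.parse_args()
        # Delegate to the matching cloud CLI for this sketch.
        if args.source.startswith("gs://"):
            subprocess.run(["gsutil", "-m", "rsync", "-r", args.source, args.dest],
                           check=True)
        elif args.source.startswith("s3://"):
            subprocess.run(["aws", "s3", "sync", args.source, args.dest], check=True)
        else:
            raise ValueError("Unsupported scheme: " + args.source)

    if __name__ == "__main__":
        syncer()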
*[Problem]*
Currently, there are only two ways