Hi,

We are trying to run Airflow 1.10 on Kubernetes. 1) The scheduler, worker, and webserver services each run in their own container. 2) We use a Docker image containing Airflow 1.10 and Python 3.x, and we bake our DAGs into that image.
With this architecture, every DAG deployment means building a new Docker image, killing the currently running Airflow workers, and restarting them from the new image. My questions are: Is it advisable to kill and restart the Airflow worker service many times a day? And what are the risks if a worker does not shut down gracefully (which I have seen happen quite often)? Let me know if this is not the right place to ask.

Thanks,
Pramiti
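For context on the graceful-shutdown part of my question: as far as I understand, Kubernetes sends SIGTERM to the container on pod deletion and only sends SIGKILL after the grace period expires, and a Celery worker treats SIGTERM as a "warm shutdown" (finish the tasks it is running, accept no new ones). So a sketch of what I think the worker Deployment would need (the names and values here are hypothetical, not from our actual manifests) is:

```yaml
# Sketch of a worker Deployment snippet (assumed names: "airflow-worker",
# "myrepo/airflow"). On pod deletion Kubernetes sends SIGTERM, which Celery
# interprets as a warm shutdown; the grace period below is how long
# Kubernetes waits before escalating to SIGKILL.
spec:
  template:
    spec:
      # Should be at least as long as the longest-running task,
      # so in-flight tasks can drain before the worker is killed.
      terminationGracePeriodSeconds: 600
      containers:
        - name: airflow-worker
          image: myrepo/airflow:1.10   # rebuilt on every DAG deploy
```

If the grace period is too short, the worker would be SIGKILLed mid-task, which I assume is where the non-graceful shutdowns I am seeing come from.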