On 23-10-25 11:14, Jarek Potiuk wrote:
Yes. You can send a single signal (TERM or HUP, I don't remember which) to a Celery worker and it puts it in "offline" mode, where it stops accepting new tasks but still runs the ones it already has. A second signal will kill it. This is a standard Celery feature; look it up in the docs. If you run it in a container, you might have to make sure that you handle signal propagation and the init process properly, so that your signal is sent only to the Celery master process and not to the spawned worker processes. See the note on signal propagation in the "entrypoint" documentation of the Airflow image: https://airflow.apache.org/docs/docker-stack/entrypoint.html#signal-propagation

On Wed, Oct 25, 2023 at 9:54 AM Lars Winderling <[email protected]> wrote:
> Dear fellow Airflow users,
>
> we have recently upgraded to a multi-worker setup, and now I'm wondering whether I can increase availability even further with respect to system downtimes. Is it possible to stop Celery from accepting new tasks? That way, a worker node could finish all running tasks, and only once it is done would I do the maintenance and reboot it. Then no tasks would get canceled. I have not been able to find anything on the web (yet), so I'm kindly asking for your advice.
>
> Thank you very much in advance.
>
> Take care,
> Lars
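The two-signal pattern described above (first signal: stop accepting work and drain; second signal: terminate) can be sketched generically. This is not Celery's implementation, just a minimal hypothetical worker loop illustrating the warm-then-cold shutdown idea:

```python
import os
import queue
import signal


class DrainingWorker:
    """Hypothetical worker: first SIGTERM goes 'offline' (drain),
    a second SIGTERM forces an immediate exit."""

    def __init__(self):
        self.accepting = True
        self.tasks = queue.Queue()
        # Install the handler on the "master" process only; in a container
        # this is why signal propagation to the right PID matters.
        signal.signal(signal.SIGTERM, self._on_term)

    def _on_term(self, signum, frame):
        if self.accepting:
            # Warm shutdown: refuse new tasks, keep running existing ones.
            self.accepting = False
        else:
            # Cold shutdown: second signal kills the worker outright.
            raise SystemExit(1)

    def submit(self, task):
        """Accept a callable only while the worker is online."""
        if not self.accepting:
            return False
        self.tasks.put(task)
        return True

    def drain(self):
        """Finish everything already queued, then return the results."""
        results = []
        while not self.tasks.empty():
            results.append(self.tasks.get()())
        return results


if __name__ == "__main__":
    w = DrainingWorker()
    w.submit(lambda: "task-1")
    # Simulate the operator sending the first TERM to this process.
    os.kill(os.getpid(), signal.SIGTERM)
    print(w.submit(lambda: "task-2"))  # rejected: worker is offline
    print(w.drain())                   # already-accepted work still completes
```

Once `drain()` returns, the node is idle and safe to reboot; any new tasks would have been picked up by the remaining workers in the multi-worker setup.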