GitHub user tony-hoff added a comment to the discussion: Airflow 3.0.6 - 
apache-airflow-providers-celery 3.12.2 - Process 'ForkPoolWorker-' exited with 
'signal 9 (SIGKILL)'

> From [AWS Troubleshooting: DAGs, Operators, Connections, and other 
> issues](https://docs.aws.amazon.com/mwaa/latest/userguide/t-apache-airflow-202.html)

If your Apache Airflow 3 tasks are failing without logs, follow these steps:

If the worker logs show an error such as `Task handler raised error: 
WorkerLostError('Worker exited prematurely: exitcode 15 Job: 12.')` around the 
time the task failed, the forked worker process assigned to the task was likely 
terminated unexpectedly.

To address this, consider configuring celery.worker_autoscale with the same 
minimum and maximum values. For example:
```
celery.worker_autoscale=5,5   # for mw1.small
celery.worker_autoscale=10,10 # for mw1.medium
celery.worker_autoscale=20,20 # for mw1.large
```

Setting the minimum and maximum to the same value disables autoscaling, so the 
worker pool size stays fixed and scale-down events cannot terminate in-flight 
forked worker processes.
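For a self-managed (non-MWAA) deployment, the same fixed pool size can be set 
through Airflow's standard configuration mechanisms. The sketch below assumes 
Airflow's usual `AIRFLOW__<SECTION>__<KEY>` environment-variable convention; the 
`5,5` value is just an example and should be sized to the worker's memory and CPU:

```shell
# Fix the Celery worker pool at 5 processes (min,max identical)
# so autoscaling never spawns or reaps forked workers mid-task.

# Option 1: environment variable (section and key upper-cased,
# joined with double underscores).
export AIRFLOW__CELERY__WORKER_AUTOSCALE="5,5"
echo "$AIRFLOW__CELERY__WORKER_AUTOSCALE"

# Option 2: the equivalent airflow.cfg entry would be:
#   [celery]
#   worker_autoscale = 5,5
```

On MWAA itself, the `celery.worker_autoscale` key is set as an "Airflow 
configuration option" in the environment settings rather than via environment 
variables.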

GitHub link: 
https://github.com/apache/airflow/discussions/62465#discussioncomment-16120434
