Our hourly DAG has 10 tasks, all of them BashOperators issuing curl
commands. We run Airflow on Docker.
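
For context, the DAG is roughly the following (a minimal sketch in
Airflow 2 style; the dag id, task ids, and URL are placeholders, not
our real ones):

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Hourly DAG with 10 BashOperator tasks, each issuing a curl call.
    with DAG(
        dag_id="hourly_curl_dag",
        start_date=datetime(2024, 1, 1),
        schedule_interval="@hourly",
    ) as dag:
        for i in range(10):
            BashOperator(
                task_id=f"curl_task_{i}",
                bash_command=f"curl -sf https://example.com/endpoint/{i}",
            )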

When we do a backfill for, say, the last 10 days, Airflow consistently
hits the memory limit (4 GB) and the container dies (OOM killed).
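
The backfill is kicked off with something like this (Airflow 2 CLI;
the dates and dag id are illustrative):

    airflow dags backfill \
        --start-date 2024-01-01 \
        --end-date 2024-01-10 \
        hourly_curl_dag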

We increased the memory to 8 GB, but I still see memory utilization at
around 90%.

When I run ps -ef, I see a lot of backfill processes, all running the
same command. I used the PIDs to dig into each process (environment
variables, etc.), and all of these processes are exactly the same. Why
so many processes?
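
For reference, this is roughly how I am inspecting them (the PID is
illustrative):

    # list the backfill processes (the [b] trick excludes grep itself)
    ps -ef | grep '[b]ackfill'

    # full command line of one process
    tr '\0' ' ' < /proc/12345/cmdline; echo

    # dump its environment, one variable per line
    tr '\0' '\n' < /proc/12345/environ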


Also, my real worry is: how much memory is enough? How does Airflow
manage memory (object pools, etc.)?
