GitHub user Rietty created a discussion: How to deploy to AWS EKS and start up 
Airflow properly in Production?

Hello folks! I'm trying to deploy Airflow to our AWS EKS environments 
(Dev/UAT/Prod) and am having trouble setting it up properly. Specifically, 
I'm installing Airflow from the Python package and trying to get the 
instance up and running.

We already deploy some other services to EKS, so I followed the same 
template and got Airflow to at least deploy. However, Airflow itself warns 
not to run `standalone` in production, so I'm not sure what to change.

Currently I have a `pyproject.toml` with a simple project, its dependencies 
(`apache-airflow==3.1.7`), and a `[[tool.uv.index]]` entry for our internal 
PyPI mirror.
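Roughly, it looks like this (the project name, index name, and mirror URL below are placeholders, not our real values):

```toml
[project]
name = "airflow-deploy"      # placeholder name
version = "0.1.0"
requires-python = ">=3.10"
dependencies = [
    "apache-airflow==3.1.7",
]

[[tool.uv.index]]
name = "internal"                               # placeholder index name
url = "https://pypi.example.internal/simple"    # placeholder mirror URL
```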

My Dockerfile simply copies these files over, installs `uv`, resolves 
dependencies via `uv sync`, and then runs `CMD ["uv", "run", "airflow", 
"standalone"]`; this is what triggers the warning above about not using 
`standalone` in production.
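The Dockerfile is roughly the following (base image, paths, and the mirror/auth details are simplified here):

```dockerfile
# Sketch of the build described above; details simplified.
FROM python:3.12-slim

# Install uv by copying the static binaries from the official image.
COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/

WORKDIR /app
COPY pyproject.toml uv.lock ./
RUN uv sync --frozen

COPY . .

# Exec (JSON-array) form; this is the part Airflow complains about.
CMD ["uv", "run", "airflow", "standalone"]
```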

So my question is: how do I deploy this app properly so that:

1. Kubernetes will properly scale workers and such when workloads become high.
2. I can handle dependencies like the Postgres database that Airflow 
requires to function? I assume there's some way to have a built-in one as 
the default. (This is how our old/existing deployment works.)
3. Is there some documentation about this stuff?

Please don't suggest options like Amazon's Managed Airflow (it's very 
expensive!) or `data-on-eks`, which is very strongly opinionated and won't 
work for us, since we are not looking to create a new AWS EKS cluster.

GitHub link: https://github.com/apache/airflow/discussions/64196
