bugraoz93 commented on issue #49330:
URL: https://github.com/apache/airflow/issues/49330#issuecomment-2810667516

   > @bugraoz93, We don't use Docker at all (and have no access to it either). 
We use virtualenvs. As mentioned, the setup here is pretty lightweight: the 
`airflow` CLI comes from a local venv where Airflow is installed. We just have 
these three env variables that go with it.
   > 
   > ```
   > AIRFLOW_HOME = '<dags_repo>/run'
   > AIRFLOW__CORE__DAGS_FOLDER = <dags_repo>
   > AIRFLOW__CORE__LOAD_EXAMPLES = false
   > ```
   > Being able to list dags and list import errors without having any services 
running is a valid and widespread use case. This is how folks sanity-check 
their dags, and maybe even run `airflow standalone` for a quick dry run before 
they deploy to the central Airflow service.
   > 
   
   I am saying that a Dag Processor should be running when deploying Airflow 
3.0+ to any environment. I don't know the details of your setup, but when you 
say "local", I take it to mean the machine you use for development. If that 
machine is running the Airflow api-server, scheduler, or triggerer in a venv, 
then a Dag Processor should be running there too. Whether it is an on-premise 
server or Docker makes no real difference in this scenario. If the venv is 
remote, i.e. not on your personal machine but on a server you connect to, then 
the Dag Processor should be running on that server.
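
   For reference, a minimal sketch of what that looks like with the venv setup 
quoted above (the `<dags_repo>` placeholder is from your message, not a real 
path; adjust to your layout):

   ```shell
   # Same three env vars from the quoted setup, exported for the shell session
   export AIRFLOW_HOME='<dags_repo>/run'
   export AIRFLOW__CORE__DAGS_FOLDER='<dags_repo>'
   export AIRFLOW__CORE__LOAD_EXAMPLES=false

   # Run a standalone Dag Processor so DAG files get parsed and import
   # errors get recorded; in another shell you can then inspect them:
   airflow dag-processor &
   airflow dags list-import-errors
   ```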


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]
