tatiana opened a new issue, #49891:
URL: https://github.com/apache/airflow/issues/49891
### Apache Airflow version
3.0.0
### If "Other Airflow 2 version" selected, which one?
_No response_
### What happened?
When I start Airflow using `airflow standalone`, things work as expected:
```
dag-processor | [2025-04-28T14:19:55.455+0100] {manager.py:748} INFO -
dag-processor | ================================================================================
dag-processor | DAG File Processing Stats
dag-processor |
dag-processor | Bundle       File Path                       PID    Current Duration  # DAGs  # Errors  Last Duration  Last Run At
dag-processor | -----------  ------------------------------  -----  ----------------  ------  --------  -------------  -----------
dag-processor | dags-folder  jaffle_shop_kubernetes.py                                     0         0
dag-processor | dags-folder  example_virtualenv.py                                         0         0
dag-processor | dags-folder  basic_cosmos_task_group.py                                    0         0
dag-processor | dags-folder  hello.py                                                      0         0
dag-processor | dags-folder  example_operators.py                                          0         0
dag-processor | dags-folder  simple_dag_async.py                                           0         0
dag-processor | dags-folder  basic_cosmos_dag.py                                           0         0
dag-processor | dags-folder  example_virtualenv_mini.py      86996  0.05s                  0         0
dag-processor | dags-folder  example_dataset_triggered.py    86994  0.06s                  0         0
dag-processor | dags-folder  example_dag_variable_access.py                                0         0
dag-processor | ================================================================================
triggerer     | [2025-04-28T14:19:55.328+0100] {triggerer_job_runner.py:161} INFO - Starting the triggerer
```
However, after some time, the DAGs vanish from the UI and the following is
logged (the DAGs that now have "errors" are the ones that vanished from the UI):
```
dag-processor | ================================================================================
dag-processor | DAG File Processing Stats
dag-processor |
dag-processor | Bundle       File Path                       PID    Current Duration  # DAGs  # Errors  Last Duration  Last Run At
dag-processor | -----------  ------------------------------  -----  ----------------  ------  --------  -------------  -------------------
dag-processor | dags-folder  jaffle_shop_kubernetes.py                                     0         1  1.14s          2025-04-28T13:20:00
dag-processor | dags-folder  example_virtualenv.py                                         0         1  2.91s          2025-04-28T13:19:59
dag-processor | dags-folder  basic_cosmos_task_group.py                                    0         1  1.21s          2025-04-28T13:20:00
dag-processor | dags-folder  simple_dag_async.py                                           0         1  1.32s          2025-04-28T13:19:59
dag-processor | dags-folder  basic_cosmos_dag.py                                           0         1  1.07s          2025-04-28T13:20:01
dag-processor | dags-folder  hello.py                                                      1         0  0.05s          2025-04-28T13:20:00
dag-processor | dags-folder  example_operators.py                                          1         0  0.20s          2025-04-28T13:20:01
dag-processor | dags-folder  example_virtualenv_mini.py                                    1         0  1.71s          2025-04-28T13:19:57
dag-processor | dags-folder  example_dataset_triggered.py                                  1         0  0.68s          2025-04-28T13:19:56
dag-processor | dags-folder  example_dag_variable_access.py                                1         0  1.16s          2025-04-28T13:19:58
dag-processor | ================================================================================
dag-processor | [2025-04-28T14:20:25.746+0100] {manager.py:523} INFO - Not time to refresh bundle dags-folder
```
If I stop Airflow and run `airflow dags list-import-errors`, there are no
errors:
```
No data found
```
If I reserialize the DAGs (`airflow dags reserialize`) or simply restart
`airflow standalone`, these DAGs show up again in the UI, until the bundle
reparsing happens and they vanish again, as described at the beginning of this
message.
The DAGs that are not showing up are dynamically generated, and it may take
some time for them to be generated. In Airflow 2, they would still be displayed,
and to be on the safe side I could set
`AIRFLOW__CORE__DAGBAG_IMPORT_TIMEOUT` and
`AIRFLOW__CORE__DAG_FILE_PROCESSOR_TIMEOUT`, after which I would no longer
experience issues. Increasing the values of these configurations did not work
in Airflow 3.0.0.
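For reference, this is the kind of override I used in Airflow 2 (the values
below are illustrative, not the exact ones from my environment):

```shell
# Airflow 2 workaround mentioned above; values are illustrative
export AIRFLOW__CORE__DAGBAG_IMPORT_TIMEOUT=300        # seconds allowed to import a DAG file
export AIRFLOW__CORE__DAG_FILE_PROCESSOR_TIMEOUT=360   # seconds allowed per processor run
```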
I also attempted to increase:
- `AIRFLOW__DAG_PROCESSOR__MIN_FILE_PROCESS_INTERVAL`
- `AIRFLOW__DAG_PROCESSOR__BUNDLE_REFRESH_CHECK_INTERVAL`
- `AIRFLOW__DAG_PROCESSOR__REFRESH_INTERVAL`
Setting the following also did not work:
```
export AIRFLOW__DAG_PROCESSOR__DAG_BUNDLE_CONFIG_LIST='[
  {
    "name": "dags-folder",
    "classpath": "airflow.dag_processing.bundles.local.LocalDagBundle",
    "kwargs": {
      "refresh_interval": 600
    }
  }
]'
```
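For context on why the parse time matters here: these DAG files do slow work at
module import time (the dynamic generation mentioned above), and all of it
counts against the DAG processor's per-file budget. A minimal sketch of that
pattern (hypothetical names, not the actual Cosmos code):

```python
import time

def generate_dag_configs():
    # stand-in for the slow, dynamic part (e.g. rendering a dbt project,
    # calling external services); in the real files this can take seconds
    time.sleep(0.1)
    return [{"dag_id": f"generated_dag_{i}"} for i in range(3)]

# This runs at import time, i.e. on every pass of the DAG file processor;
# if it exceeds the processor timeout, the file is counted as an error.
CONFIGS = generate_dag_configs()

# In a real DAG file, each config would then be turned into a DAG object.
for config in CONFIGS:
    print(config["dag_id"])
```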
### What you think should happen instead?
I believe the Airflow logs should advise users that there was a timeout while
parsing those DAGs and recommend the best way to increase the time available
for parsing and bundle refresh, instead of only logging "Not time to refresh
bundle dags-folder".
### How to reproduce
Clone this repo:
https://github.com/astronomer/astronomer-cosmos/
Export `PYTHONPATH=<your-path>/astronomer-cosmos/`
Follow the steps described in the following README and run `airflow
standalone` from this folder:
https://github.com/astronomer/astronomer-cosmos/tree/main/scripts/airflow3
### Operating System
macOS
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other
### Deployment details
Local `airflow standalone`
### Anything else?
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [x] I agree to follow this project's [Code of
Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)