fuguixing edited a comment on issue #17429: URL: https://github.com/apache/airflow/issues/17429#issuecomment-894013281

Thanks for your reply! In our environment, I need a newly added DAG file to become available immediately, without waiting for Airflow to call **_refresh_dag_dir** (in dag_processing.py). Setting **dag_dir_list_interval** very small would make the file appear sooner, but it wastes resources. I tried the following code, which loads the specified DAG file into the database and makes it available immediately; it uses **DagFileProcessor** from scheduler_job.py. Can I encapsulate it as a RESTful API?

In heartbeat() in dag_processing.py:

```python
processor = self._processor_factory(file_path, self._zombies)
processor.start()
```
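To illustrate the "encapsulate it as a RESTful API" idea, here is a minimal, stdlib-only sketch of an HTTP endpoint that triggers on-demand parsing of a single DAG file. This is not Airflow's API: `process_dag_file` is a hypothetical stand-in for the `DagFileProcessor` logic invoked in `heartbeat()`, and the `/refresh_dag` route name is an assumption for illustration only.

```python
# Hedged sketch: a tiny HTTP server (stdlib only) exposing one endpoint
# that triggers on-demand parsing of a single DAG file.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qs


def process_dag_file(file_path: str) -> dict:
    # Placeholder: in a real deployment this would spawn a
    # DagFileProcessor for `file_path` (as heartbeat() does) and wait
    # for the DAG to be serialized into the metadata database.
    return {"file_path": file_path, "status": "queued"}


class RefreshHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        parsed = urlparse(self.path)
        paths = parse_qs(parsed.query).get("file_path")
        if parsed.path != "/refresh_dag" or not paths:
            self.send_response(400)
            self.end_headers()
            return
        body = json.dumps(process_dag_file(paths[0])).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Silence per-request logging for this sketch.
        pass


def serve(port: int = 0) -> HTTPServer:
    # port=0 lets the OS pick a free port; the chosen port is available
    # via server.server_address[1].
    server = HTTPServer(("127.0.0.1", port), RefreshHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A production version would of course need authentication and should validate that the requested path lies inside the DAGs folder before parsing it.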
Thanks for your reply! In our env, I need to make the new specified DAG file available immediately without waiting for Airflow call **_refresh_dag_dir**(in the dag_processing.py), if **dag_dir_list_interval** is too small, which may cause a waste of resources. I tried the following code, which can load the new specified DAG file to the database and make it available immediately, and it uses **DagFileProcessor** in the scheduler_job.py. Can I encapsulate it as a restful API? heartbeat() in the dag_processing.py `processor = self._processor_factory(file_path, self._zombies)` `processor.start()` -- This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment. To unsubscribe, e-mail: commits-unsubscr...@airflow.apache.org For queries about this service, please contact Infrastructure at: us...@infra.apache.org