luoyuliuyin commented on code in PR #39484:
URL: https://github.com/apache/airflow/pull/39484#discussion_r1596007476
##########
airflow/providers/celery/executors/celery_executor.py:
##########
@@ -267,6 +267,7 @@ def __init__(self):
         self.tasks = {}
         self.task_publish_retries: Counter[TaskInstanceKey] = Counter()
         self.task_publish_max_retries = conf.getint("celery", "task_publish_max_retries")
+        self.send_pool = ProcessPoolExecutor(max_workers=self._sync_parallelism)
Review Comment:
In my performance testing I also used a DAG composed of a simple
BashOperator. You may not see a difference when the load is relatively low.
In my tests the load on the Airflow scheduler was very high: 40,000
scheduled DAGs (triggered every 2 hours) plus 10,000 manually triggered
DAGs, running with 30 schedulers and 100 workers.
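
For context, a minimal sketch of the pattern the diff introduces: create one
ProcessPoolExecutor when the executor starts and reuse it across sync calls,
instead of spawning a fresh pool on every heartbeat. The names
PublisherSketch and _send_task below are hypothetical stand-ins; the real
executor publishes through Celery's task-sending machinery.

    from concurrent.futures import ProcessPoolExecutor

    def _send_task(task_tuple):
        # Hypothetical stand-in for the per-task publish call
        # (the real code would call into Celery here).
        key, command = task_tuple
        return key

    class PublisherSketch:
        def __init__(self, sync_parallelism: int = 4):
            # Create the pool once at startup so repeated scheduler
            # heartbeats reuse the same worker processes.
            self.send_pool = ProcessPoolExecutor(max_workers=sync_parallelism)

        def publish(self, task_tuples):
            # Fan the publish calls out across the pool and collect results.
            return list(self.send_pool.map(_send_task, task_tuples))

    if __name__ == "__main__":
        publisher = PublisherSketch(sync_parallelism=2)
        print(publisher.publish([("task-1", "echo 1"), ("task-2", "echo 2")]))

Keeping the pool alive avoids paying the process fork/spawn cost on every
scheduler loop, which is where the gain shows up under the high-load
conditions described above.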