Hello Airflow Dev,
I am using Airflow to schedule, orchestrate, and monitor data pipelines. My
airflow.cfg is the default one; I haven't changed any attribute values yet.
*Main DAG:*
dag = DAG(
    dag_id=DAG_NAME,
    default_args=args,
    schedule_interval=None,
    concurrency=8,
)
I have 8 other sub-DAGs, each with similar settings:
group_one_parquet = SubDagOperator(
    executor=LocalExecutor(),
    task_id='group_one_parquet',
    subdag=group_one_parquet_subdag(DAG_NAME, 'group_one_parquet', args,
                                    network_id, schema_name, env_name),
    default_args=args,
    dag=dag,
)
Now, when I run this same DAG, say, 5 times in parallel, with different data
points passed explicitly via **kwargs, I get a deadlock error:
1) The maximum number of running tasks (8) for this task's DAG 'xya' has
been reached
2) "BackfillJob is deadlocked" in the log.
I would like to scale the LocalExecutor vertically only, because of certain
limitations on my side.
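For reference, this is the kind of change I am considering: raising the per-DAG task cap (which the error message's limit of 8 seems to come from, since my DAG sets concurrency=8) together with the global limits in airflow.cfg. The values below are only a sketch, and I am assuming these are the settings that matter here (Airflow 1.x names):

```
# airflow.cfg fragment -- assumed relevant knobs, values are illustrative
[core]
# Global cap on simultaneously running task instances for this executor.
parallelism = 32
# Default per-DAG cap on running tasks; note my DAG overrides this
# with concurrency=8 in the DAG() call, so that would need raising too.
dag_concurrency = 16
```

Would bumping these (and the DAG's own concurrency argument) be the right way to scale LocalExecutor vertically, or does the SubDagOperator + LocalExecutor combination deadlock regardless of these limits?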
Can someone please share some insight from experience?
Thanks