Hi Heng, I have the same problem: failed jobs do not dump log files. Do you have this problem too? Are you using Redis as the message queue? Also, could you send the parallelism- and concurrency-related settings from your airflow.cfg file?
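For reference, these are the kinds of settings I mean. The key names are the ones in the 1.10.x airflow.cfg; the values below are only illustrative (the broker_url is a hypothetical Redis URL, not a recommendation):

[core]
# total task instances that may run at once across the whole installation
parallelism = 32
# task instances allowed to run at once within a single DAG
dag_concurrency = 16
# active DAG runs allowed per DAG
max_active_runs_per_dag = 16

[celery]
# task instances each Celery worker picks up
worker_concurrency = 16
# message broker URL (this is where Redis would be configured)
broker_url = redis://localhost:6379/0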
Best regards,
Mehmet

heng gu <heng...@yahoo.com.invalid> wrote on Mon, 24 Feb 2020 at 17:51:
> I have this DAG with a BranchPythonOperator task kicking off a subset of 24
> tasks, in this case 4 tasks. Two of the tasks were successful; the other two
> (register_YZ, register_ZY) failed without running (see the attached UI
> screenshots). There are no logs for tasks register_YZ and register_ZY. I am
> using the Celery Executor and running 12 workers executing the register_XX
> tasks. I am using Airflow version 1.10.6. Any idea how to fix it?

--
Mehmet ERSOY
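P.S. For anyone following along, here is a minimal sketch of the kind of branching layout described above, written against the Airflow 1.10.x API. The task names, callables, and branch selection are hypothetical stand-ins, not the actual DAG:

from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import BranchPythonOperator, PythonOperator

# Stand-ins for the 24 register_XX branches mentioned above
REGISTER_IDS = ["XY", "YX", "YZ", "ZY"]


def choose_branches():
    # Return the task_ids that should run; BranchPythonOperator skips the rest.
    # (Hypothetical: here we simply select every branch.)
    return ["register_%s" % pair for pair in REGISTER_IDS]


def do_register(pair):
    print("registering %s" % pair)


with DAG(
    dag_id="register_branching_example",
    start_date=datetime(2020, 2, 1),
    schedule_interval=None,
) as dag:
    branch = BranchPythonOperator(
        task_id="choose_branches",
        python_callable=choose_branches,
    )

    for pair in REGISTER_IDS:
        register = PythonOperator(
            task_id="register_%s" % pair,
            python_callable=do_register,
            op_kwargs={"pair": pair},
        )
        branch >> register

In a layout like this, the branch callable returns the list of register_* task_ids to run, and every other downstream register task is marked skipped rather than failed.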