I got some advice on Slack to check airflow-worker.err, and here is what I
found:
  File "python3.6/site-packages/airflow/executors/celery_executor.py", line 67, 
in execute_command    close_fds=True, env=env)  File 
"pypacks/lib/python3.6/subprocess.py", line 311, in check_call    raise 
CalledProcessError(retcode, cmd)
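For context, the code around that line in the 1.10.x CeleryExecutor is roughly the
following (paraphrased from the source, so details may differ): the worker runs the
airflow run command in a subprocess, and if that subprocess exits non-zero before the
task itself starts (for example on an import or config error), the task fails without a
per-task log ever being written.

import os
import subprocess

from airflow.exceptions import AirflowException

def execute_command(command_to_exec):
    # command_to_exec is the "airflow run <dag_id> <task_id> <execution_date> ..." command
    env = os.environ.copy()
    try:
        subprocess.check_call(command_to_exec, stderr=subprocess.STDOUT,
                              close_fds=True, env=env)
    except subprocess.CalledProcessError:
        # If the subprocess dies before task logging is initialised,
        # no per-task log file appears, only this worker-side error.
        raise AirflowException('Celery command failed')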
I am using RabbitMQ. Here are the parallelism and concurrency settings in the cfg
file:
# The amount of parallelism as a setting to the executor. This defines
# the max number of task instances that should run simultaneously
# on this airflow installation
parallelism = 32

# The number of task instances allowed to run concurrently by the scheduler
dag_concurrency = 16
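Both of these sit under [core]. A quick way to confirm the values the workers actually
see is to read them back through airflow.configuration.conf (a small sketch, assuming
the worker machine uses the same airflow.cfg):

from airflow.configuration import conf

# Effective settings after airflow.cfg and any AIRFLOW__CORE__* env overrides
print(conf.getint('core', 'parallelism'))      # expected: 32
print(conf.getint('core', 'dag_concurrency'))  # expected: 16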

For this DAG, I set concurrency=16 and max_active_runs=8.
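Roughly, the DAG is shaped like this (a simplified sketch; apart from register_YZ,
register_ZY, concurrency=16 and max_active_runs=8, the names and values are
placeholders):

from datetime import datetime

from airflow import DAG
from airflow.operators.python_operator import BranchPythonOperator, PythonOperator

def choose_registers(**context):
    # Placeholder branching rule: return the task_ids of the register_* tasks
    # that should run for this DAG run.
    return ['register_YZ', 'register_ZY']

with DAG(
    dag_id='register_dag',
    start_date=datetime(2020, 1, 1),
    schedule_interval='@daily',
    concurrency=16,          # max task instances of this DAG running at once
    max_active_runs=8,       # max simultaneous runs of this DAG
) as dag:
    branch = BranchPythonOperator(
        task_id='branch',
        python_callable=choose_registers,
        provide_context=True,
    )

    for name in ['register_YZ', 'register_ZY']:  # 24 of these in the real DAG
        branch >> PythonOperator(task_id=name, python_callable=lambda: None)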
Thanks,
Heng
    On Wednesday, February 26, 2020, 03:18:27 AM EST, Mehmet Ersoy 
<mehmet.ersoy1...@gmail.com> wrote:  
 
 Hi Heng,
I have the same problem. Failed jobs do not dump log files. Do you have this
problem too?
Have you been using Redis as the message queue? Also, can you send the
parallelism and concurrency related settings from your airflow.cfg file?

Best regards,
Mehmet.

On Mon, 24 Feb 2020 at 17:51, heng gu <heng...@yahoo.com.invalid> wrote:

> I have a DAG with a BranchPythonOperator task that kicks off a subset of 24
> downstream tasks, in this case 4 tasks. Two of those tasks were successful; the
> other two (register_YZ, register_ZY) failed without running (see the attached UI
> screenshots). There are no logs for tasks register_YZ and register_ZY. I am
> using the Celery Executor and running 12 workers executing the register_XX tasks. I
> am using Airflow version 1.10.6. Any idea how to fix it?
>


-- 
Mehmet ERSOY  
