I just found out that Airflow creates some temporary config files in the /tmp
directory on servers that run Celery workers. All the files are named tmpxxx and
have the same content, like:
{"core": {"dags_folder": "/data/data366/airflow/dags", "base_log_folder":
"/data/data366/airflow/logs", "remote_logging":
Hello,
I am not sure, but please take a look at this configuration option; it is
probably the solution to your problem:
https://airflow.readthedocs.io/en/latest/configurations-ref.html#celery-config-options
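For example, celery_config_options in the [celery] section lets you point
Airflow at your own Celery configuration dict instead of the built-in default
(my_celery_config below is just a placeholder for a module on your PYTHONPATH,
not something that ships with Airflow):

    [celery]
    celery_config_options = my_celery_config.CELERY_CONFIG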
Best regards,
Kamil
On Wed, Apr 1, 2020 at 5:43 PM heng gu wrote:
I am using the Celery executor with RabbitMQ and a PostgreSQL database backend.
The celery_taskmeta table never gets automatically cleaned by Airflow. In the
Celery configuration you can set CELERY_TASK_RESULT_EXPIRES, but I cannot find
it in airflow.cfg or in the other config templates. Does anybody know how to set it?
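(One possible route, sketched here as an assumption rather than a confirmed
answer: Airflow does not appear to expose CELERY_TASK_RESULT_EXPIRES directly,
but the celery_config_options option mentioned above can point at a custom
config dict that sets Celery's result_expires; my_celery_config is a
hypothetical module name.)

    # my_celery_config.py -- must be importable by the scheduler and the workers
    from airflow.config_templates.default_celery import DEFAULT_CELERY_CONFIG

    # Start from Airflow's stock Celery settings and override the result TTL.
    CELERY_CONFIG = dict(DEFAULT_CELERY_CONFIG)

    # result_expires is the new-style name for CELERY_TASK_RESULT_EXPIRES; with
    # the database result backend, rows in celery_taskmeta older than this are
    # purged by the celery.backend_cleanup task (which needs celery beat running).
    CELERY_CONFIG["result_expires"] = 60 * 60 * 24  # one day, in seconds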
I was doing a backfill of a one-task dag. There were over 1000 dag_runs ready to
run. I set concurrency=16 and max_active_runs=16; I use the Celery executor and
have 12 workers. Task instances were quickly scheduled, but they got stuck there
and never ran. I did a search online and got this:
To: dev@airflow.apache.org
Subject: Re: task failed without running
Hi Heng,
I have the same problem: failed jobs do not dump log files. Do you have this
problem too?
Have you been using Redis as the message queue? Also, can you send the
configuration related to parallelism and concurrency in your airflow.cfg file?
Best regards,
Mehmet.
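(For reference, the parallelism and concurrency knobs in question live in
airflow.cfg; the excerpt below shows the stock Airflow 1.10 defaults as an
illustration, not a recommendation.)

    [core]
    # maximum task instances running at once across the whole installation
    parallelism = 32
    # maximum task instances running at once within a single dag
    dag_concurrency = 16
    # maximum concurrent dag_runs per dag
    max_active_runs_per_dag = 16

    [celery]
    # tasks a single Celery worker can execute at once
    worker_concurrency = 16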
On Mon, Feb 24, 2020 at 5:51 PM heng gu wrote:
I have this dag with a BranchPythonOperator task kicking off some of 24 tasks;
in this case, 4 tasks. Two of the tasks were successful; the other two
(register_YZ, register_ZY) failed without running (see the attached UI
screenshots). There is no log for tasks register_YZ and register_ZY. I
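(To make the setup concrete, a minimal sketch of the dag shape under discussion;
the dag id, the task names other than register_YZ/register_ZY, and the branching
logic are placeholders, not the real code.)

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.dummy_operator import DummyOperator
    from airflow.operators.python_operator import BranchPythonOperator

    # The real dag has 24 register_* tasks; four stand in for them here.
    REGISTER_TASKS = ["register_YZ", "register_ZY", "register_XW", "register_WX"]

    def pick_registers(**context):
        # BranchPythonOperator may return a list of task_ids (Airflow >= 1.10.3);
        # downstream tasks not in the returned list are skipped. Here all four
        # stand-ins are selected, mirroring the run described above.
        return REGISTER_TASKS

    with DAG("register_dag", start_date=datetime(2020, 1, 1),
             schedule_interval=None) as dag:
        branch = BranchPythonOperator(
            task_id="branch",
            python_callable=pick_registers,
            provide_context=True,
        )
        for name in REGISTER_TASKS:
            branch >> DummyOperator(task_id=name)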