Hello Friends,

I'm new to Airflow. I'm using the Celery executor with a Postgres metadata
backend and Redis as the message queue service. For now, there are 4
workers, 1 scheduler, and 1 web server.
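For reference, the relevant parts of my airflow.cfg look roughly like this
(the hosts, ports, and credentials below are placeholders, not my real
values):

[core]
executor = CeleryExecutor
# Postgres as the metadata backend (placeholder connection string)
sql_alchemy_conn = postgresql+psycopg2://airflow:airflow@pg-host:5432/airflow

[celery]
# Redis as the message queue (placeholder URL)
broker_url = redis://redis-host:6379/0
result_backend = db+postgresql://airflow:airflow@pg-host:5432/airflow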
I have been running parallel Sqoop jobs in my daily DAGs.
When a daily DAG run is scheduled, some task instances often turn to the
failed state right after starting, without ever reaching the running state.
Their logs are blank, so I can't see what went wrong. When I trigger the
same task manually, it runs without any problem.
I don't really understand whether there is something inconsistent in the
way I've written my DAG.
I have attached one of my DAGs.
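For context, here is a simplified sketch of its structure. I'm showing the
Sqoop imports through BashOperator for simplicity; the table names, JDBC
connection string, and target paths are placeholders, not my real ones:

# Simplified sketch of my daily DAG: several independent Sqoop imports.
# Table names, JDBC connection, and HDFS paths are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    "owner": "etl",
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="daily_sqoop_imports",
    default_args=default_args,
    start_date=datetime(2019, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # One Sqoop import task per source table; with no dependencies
    # between them, the Celery workers pick them up in parallel.
    for table in ["table_a", "table_b", "table_c", "table_d"]:
        BashOperator(
            task_id="sqoop_import_{}".format(table),
            bash_command=(
                "sqoop import "
                "--connect jdbc:postgresql://source-db-host/source_db "
                "--table {table} "
                "--target-dir /user/etl/{table}/{{{{ ds }}}}"
            ).format(table=table),
        )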

Thank you in advance,
Best Regards.
Mehmet.
