Hello fellow Airflowers,

I'm relatively new to Airflow and I'm really grateful as it already saved us some pain in production. So thanks for all the work! 🙏

Now I'm trying to set up a DAG with around 20-30 tasks (BigQuery queries, PySpark Dataproc jobs), and I've seen some weird behavior: a DAG run stops partway through, the DAG run is marked as success, but some tasks are still in the cleared (no-status) state. The annoying part is that there's not even a sign of failure anywhere.

Do you know why this might be happening? I couldn't find a related issue on GitHub. One thing I'm suspecting is the DAG import timing out; could that cause such behavior?
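In case it matters, this is the setting I'm thinking of bumping while I investigate (assuming a stock airflow.cfg; the default of 30 seconds is what I believe 1.10.x ships with):

```ini
[core]
# Seconds the scheduler allows for importing a single DAG file
# before killing the import. Default is 30 in Airflow 1.10.x.
# Our DAG file builds ~30 tasks, so I'm considering raising it.
dagbag_import_timeout = 120
```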

(I'm using version 1.10.3.)

Thanks in advance for any pointers.

Cheers,
Gabor
