GitHub user shaealh added a comment to the discussion: Airflow task failed but 
spark kube app is running

Thank you @rcrchawla 
Since you see the same failure with BashOperator, this doesn't look like a 
Spark-specific issue; the problem is likely in the Airflow control-plane path 
(Execution API <--> metadata DB), not the Spark runtime.

One more thing to confirm: I still don't see the final DB error message.
Can you share the exact exception line after the stack trace, for example 
sqlalchemy.exc.OperationalError ...? The final line of the MySQL/SQLAlchemy 
error should be enough; no need for the full traceback.
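A quick way to pull just that summary line out of a large log, as a sketch: the file name and log contents below are illustrative (not from your environment), but the pattern should match whatever sqlalchemy.exc.* exception your log ends with.

```shell
# Illustrative sample log standing in for a real Airflow task/scheduler log.
cat > airflow-task.log <<'EOF'
Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/sqlalchemy/engine/base.py", line 1910, in _execute_context
    self.dialect.do_execute(
sqlalchemy.exc.OperationalError: (MySQLdb.OperationalError) (2013, 'Lost connection to MySQL server during query')
EOF

# The last sqlalchemy.exc.* line is usually the one-line summary worth sharing.
grep 'sqlalchemy\.exc\.' airflow-task.log | tail -n 1
```

Running this against your actual task or scheduler log (instead of the sample file) should print exactly the line we need.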

The second thing that would help is the exact Airflow log line that says why 
the task state changed from running to failed.
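To locate that line, a hedged sketch: the sample log below is made up, and the exact wording varies by Airflow version ("Marking task as FAILED" and "externally set to failed" are common forms, but your log may differ), so treat the patterns as starting points.

```shell
# Illustrative scheduler log; in practice point grep at your real log files.
cat > scheduler.log <<'EOF'
[2024-01-01T00:00:01] INFO - Task submit_spark_job started
[2024-01-01T00:05:42] ERROR - Marking task as FAILED. dag_id=spark_dag, task_id=submit_spark_job
EOF

# Case-insensitive search for typical state-transition messages.
grep -iE 'marking task as failed|externally set to failed' scheduler.log
```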

So far this still looks more like a DB/API capacity issue or transient DB 
unavailability.

GitHub link: 
https://github.com/apache/airflow/discussions/63298#discussioncomment-16091889

----
This is an automatically sent email for [email protected].
To unsubscribe, please send an email to: [email protected]
