Hello, I have a job running on Databricks Runtime 11.3 LTS (Spark 3.3.0) that is failing with a NullPointerException. The stack trace contains no references to the job's own code, only Spark internals, which makes me wonder whether we're hitting a Spark bug or whether there's an underlying cause in our code.
I found a thread, https://lists.apache.org/thread/x55y75sdgo334nmm0gm2r8glhy8w4j05, that reports a stack trace similar to ours. Here's our complete stack trace: https://gist.github.com/alhuelamo/a77baa211ea4bb6febc4303cc3d9fd8a

Thanks in advance for any insights you can provide!

~Alberto