Hi there, I am trying to create a listener for my Spark job that sends additional notifications on failure, using this Scala API: https://spark.apache.org/docs/1.2.1/api/scala/#org.apache.spark.scheduler.JobResult .
My idea was to write something like this:

    override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit = {
      jobEnd.jobResult match {
        case JobFailed(exception) => // do stuff here
      }
    }

However, the JobFailed class is package private, so I cannot do this. Its sibling class, JobSucceeded, is public, but obviously I want to handle the failure scenario and be able to introspect the exception. I did notice that the corresponding class in the Java API is public. Is there another pattern I should follow to handle failures? Thanks!
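For context, the closest I can get without access to JobFailed is to match on the public JobSucceeded case and treat any other JobResult as a failure. A sketch of that workaround (the listener name and the lastFailure field are mine; note the failure is then only inspectable generically, e.g. via toString, rather than through the actual exception):

```scala
import org.apache.spark.scheduler.{JobSucceeded, SparkListener, SparkListenerJobEnd}

// Hypothetical listener: matches the public JobSucceeded case and treats
// every other JobResult as a failure, since JobFailed itself is
// package-private in this version of the API.
class FailureNotifyingListener extends SparkListener {
  @volatile var lastFailure: Option[String] = None

  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit = {
    jobEnd.jobResult match {
      case JobSucceeded => // nothing to do
      case failure =>
        // We cannot destructure JobFailed here, so the best we can do is
        // record a generic description of the failure.
        lastFailure = Some(s"Job ${jobEnd.jobId} failed: $failure")
    }
  }
}
```

This loses the typed access to the underlying exception, which is exactly what I'd like to avoid.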