I'm trying to create a simple SparkListener to get notified of errors on
executors, but I don't get any callbacks on my SparkListener. Below is some
simple code I'm executing in spark-shell, and I still don't get any
callbacks on my listener. Am I doing something wrong?
Thanks for any clue you can send my way.
Cheers
Praveen
======
import org.apache.spark.scheduler.SparkListener
import org.apache.spark.scheduler.SparkListenerApplicationStart
import org.apache.spark.scheduler.SparkListenerApplicationEnd
import org.apache.spark.SparkException
sc.addSparkListener(new SparkListener() {
  override def onApplicationStart(applicationStart: SparkListenerApplicationStart) {
    println(">>>> onApplicationStart: " + applicationStart.appName)
  }
  override def onApplicationEnd(applicationEnd: SparkListenerApplicationEnd) {
    println(">>>> onApplicationEnd: " + applicationEnd.time)
  }
})
sc.parallelize(List(1, 2, 3)).map(throw new SparkException("test")).collect()
=======
output:
scala> org.apache.spark.SparkException: test
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
at $iwC$$iwC$$iwC.<init>(<console>:34)
at $iwC$$iwC.<init>(<console>:36)
at $iwC.<init>(<console>:38)
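
For reference, the way Scala treats a bare `throw` passed where a function is expected can be sketched without Spark at all. In Scala, `throw` is an expression of type `Nothing`, which conforms to any type, including `Int => Int`, so the argument is evaluated eagerly at the call site rather than being shipped as a function. `mapLike` below is a hypothetical stand-in for `RDD.map`, not a Spark API:

```scala
// A bare `throw` is an expression of type Nothing, which conforms
// to Int => Int, so it is evaluated eagerly at the call site --
// before the receiving method body ever runs.
def mapLike(f: Int => Int): String = "mapLike body reached"

val eager =
  try { mapLike(throw new RuntimeException("eager")) }
  catch { case e: RuntimeException => "threw at call site: " + e.getMessage }
println(eager)  // threw at call site: eager

// Wrapping the throw in a function literal defers it until f is applied.
val deferred = mapLike(_ => throw new RuntimeException("deferred"))
println(deferred)  // mapLike body reached
```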