You need to call sc.stop() to wait for the notifications to be processed.
Best Regards,
Shixiong(Ryan) Zhu
2015-04-21 4:18 GMT+08:00 Praveen Balaji secondorderpolynom...@gmail.com:
Thanks Shixiong. I tried it out and it works.
If you're looking at this post, here are a few points you may find useful.
The problem is the code you use to test:
sc.parallelize(List(1, 2, 3)).map(throw new SparkException("test")).collect();
is like the following example:
def foo: Int => Nothing = {
  throw new SparkException("test")
}
sc.parallelize(List(1, 2, 3)).map(foo).collect();
So actually the Spark jobs do not even get submitted: the exception is thrown on the driver while evaluating the argument to map, before any job runs.
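Shixiong's point can be demonstrated in plain Scala, independent of Spark. When you write `map(throw ...)`, the `throw` expression (typed `Nothing`, which conforms to any function type) is evaluated while building the argument, so it fires before the receiving method ever runs; a lambda defers the throw until the function is applied. This is a minimal sketch (names like `call` and `bodyRan` are illustrative, not from the thread):

```scala
object EagerVsDeferred {
  private var bodyRan = false

  // Stand-in for map: takes a function and applies it.
  private def call(f: Int => Int): Unit = { bodyRan = true; f(1) }

  // Passing `throw ...` directly: the argument expression itself throws,
  // so `call` never runs. This mirrors map(throw new SparkException("test")).
  def eagerEntersCall: Boolean = {
    bodyRan = false
    try call(throw new RuntimeException("test"))
    catch { case _: RuntimeException => () }
    bodyRan
  }

  // Passing a lambda: the function is handed over intact and only throws
  // when applied inside `call` (on an executor, in Spark's case).
  def lambdaEntersCall: Boolean = {
    bodyRan = false
    try call(_ => throw new RuntimeException("test"))
    catch { case _: RuntimeException => () }
    bodyRan
  }

  def main(args: Array[String]): Unit = {
    println(s"eager form entered call: $eagerEntersCall")   // false
    println(s"lambda form entered call: $lambdaEntersCall") // true
  }
}
```

In Spark terms: the eager form kills the driver before a job is submitted, so no listener events are generated; the lambda form fails inside tasks on executors, which is what produces the task/job failure events a listener can observe.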
Thanks Shixiong. I'll try this.
On Sun, Apr 19, 2015, 7:36 PM Shixiong Zhu zsxw...@gmail.com wrote:
The problem is the code you use to test:
sc.parallelize(List(1, 2, 3)).map(throw new SparkException("test")).collect();
is like the following example:
def foo: Int => Nothing = {
  throw new SparkException("test")
}
Thanks for the response, Archit. I get callbacks when I do not throw an
exception from map.
My use case, however, is to get callbacks for exceptions in transformations
on executors. Do you think I'm going down the right route?
Cheers
-p
On Sat, Apr 18, 2015 at 1:49 AM, Archit Thakur wrote:
Hi Praveen,
Can you try removing the throw exception in map? Do you still not get the callbacks?
On Apr 18, 2015 8:14 AM, Praveen Balaji secondorderpolynom...@gmail.com
wrote:
Thanks for the response, Imran. I probably chose the wrong methods for
this email. I implemented all methods of SparkListener and the only
callback I get is onExecutorMetricsUpdate.
I'm trying to create a simple SparkListener to get notified of errors on
executors. I do not get any callbacks on my SparkListener. Here's some
simple code I'm executing in spark-shell, but I still don't get any
callbacks on my listener. Am I doing something wrong?
Thanks for any clue you can send.
When you start the spark-shell, it's already too late to get the
ApplicationStart event. Try listening for StageCompleted or JobEnd instead.
On Fri, Apr 17, 2015 at 5:54 PM, Praveen Balaji
secondorderpolynom...@gmail.com wrote:
I'm trying to create a simple SparkListener to get notified of errors on executors.
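Imran's suggestion could be sketched as follows. This is a hedged example for spark-shell (where `sc` is already defined), not code from the thread; it overrides the job- and stage-level callbacks, which do fire after the shell starts, unlike ApplicationStart:

```scala
import org.apache.spark.scheduler._

// Register after startup: ApplicationStart has already been emitted,
// but job and stage events from subsequent actions still arrive.
sc.addSparkListener(new SparkListener() {
  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit =
    println(s"job ${jobEnd.jobId} ended: ${jobEnd.jobResult}")

  override def onStageCompleted(stage: SparkListenerStageCompleted): Unit =
    println(s"stage ${stage.stageInfo.stageId} completed")
})

// Any action now generates job/stage events for the listener.
sc.parallelize(List(1, 2, 3)).count()
```

Exact callback signatures vary slightly across Spark versions; this sketch assumes the 1.x `org.apache.spark.scheduler.SparkListener` API.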
Thanks for the response, Imran. I probably chose the wrong methods for this
email. I implemented all methods of SparkListener and the only callback I
get is onExecutorMetricsUpdate.
Here's the complete code:
==
import org.apache.spark.scheduler._
sc.addSparkListener(new SparkListener()
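The snippet above is cut off in the archive. A minimal reconstruction of such a listener, for spark-shell against the 1.x API, might look like the following (the overridden callbacks and messages are illustrative, not the original poster's code). Note Shixiong's two earlier points: the exception must be thrown from inside a lambda so it happens on an executor, and the listener bus is asynchronous, so sc.stop() is what waits for queued notifications to be delivered:

```scala
import org.apache.spark.SparkException
import org.apache.spark.scheduler._

sc.addSparkListener(new SparkListener() {
  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit =
    println(s"task ended, reason: ${taskEnd.reason}")

  override def onJobEnd(jobEnd: SparkListenerJobEnd): Unit =
    println(s"job ${jobEnd.jobId} ended: ${jobEnd.jobResult}")
})

// Throw inside the lambda so the failure happens in a task on an executor,
// not on the driver while the argument is being evaluated.
try {
  sc.parallelize(List(1, 2, 3))
    .map(_ => throw new SparkException("test"))
    .collect()
} catch {
  case e: Exception => println("driver saw: " + e.getMessage)
}

// The listener bus delivers events asynchronously; stopping the context
// flushes pending events so the callbacks above actually get a chance to run.
sc.stop()
```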