What exactly do you mean by "alerts"?

Something specific to your data, or general events of the Spark cluster? For
the former, something like what Akhil suggested should work. For the latter,
I would suggest having a log consolidation system such as Logstash in place
and using it to generate alerts.
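
If you want to react to conditions of the streaming job itself from the
driver, Spark also lets you register a StreamingListener. A minimal,
untested sketch (the delay threshold and the sendAlert() body are
placeholders, not something from this thread):

    import org.apache.spark.streaming.scheduler.{StreamingListener, StreamingListenerBatchCompleted}

    // Fires an alert whenever a completed batch took longer than maxDelayMs.
    class BatchDelayAlerter(maxDelayMs: Long) extends StreamingListener {
      override def onBatchCompleted(batch: StreamingListenerBatchCompleted): Unit = {
        // processingDelay is an Option[Long] in milliseconds
        batch.batchInfo.processingDelay.foreach { delayMs =>
          if (delayMs > maxDelayMs)
            sendAlert(s"Batch ${batch.batchInfo.batchTime} took ${delayMs} ms")
        }
      }
      // placeholder: wire this up to email, SMS, or a webhook
      private def sendAlert(msg: String): Unit = println(msg)
    }

    // Register it on the driver, given a StreamingContext ssc:
    // ssc.addStreamingListener(new BatchDelayAlerter(5000))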

Regards,
Jeff

2015-03-23 7:39 GMT+01:00 Akhil Das <ak...@sigmoidanalytics.com>:

> What do you mean you can't send it directly from the Spark workers? Here's
> a simple approach you could take:
>
>     val data = ssc.textFileStream("sigmoid/")
>     // the closure passed to foreachRDD runs on the driver, so alert()
>     // fires there; only rdd.count() executes on the workers
>     data.filter(_.contains("ERROR")).foreachRDD(rdd =>
>       alert("Errors: " + rdd.count()))
>
> And the alert() function could be anything triggering an email or sending
> an SMS alert (a rough sketch of one such function follows the quoted
> thread below).
>
> Thanks
> Best Regards
>
> On Sun, Mar 22, 2015 at 1:52 AM, Mohit Anchlia <mohitanch...@gmail.com>
> wrote:
>
>> Is there a module in Spark Streaming that lets you listen to
>> alerts/conditions as they happen in the streaming job? Spark Streaming
>> components generally execute on large clusters alongside systems like HDFS
>> or Cassandra, but when it comes to alerting you generally can't send
>> alerts directly from the Spark workers, which means you need a way to
>> listen for them.
>>
>
>
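
For reference, here is a minimal sketch of what such an alert() function
could look like. It is illustrative only: the webhook URL is a placeholder,
and an SMTP or SMS-gateway call would slot in the same way.

    import java.net.{HttpURLConnection, URL}
    import java.nio.charset.StandardCharsets

    // Posts the alert text to a (placeholder) webhook endpoint.
    def alert(message: String): Unit = {
      // placeholder URL; swap in a real webhook, mail relay, or SMS gateway
      val url = new URL("https://example.com/hooks/spark-alerts")
      val conn = url.openConnection().asInstanceOf[HttpURLConnection]
      conn.setRequestMethod("POST")
      conn.setDoOutput(true)
      conn.setRequestProperty("Content-Type", "text/plain; charset=utf-8")
      val out = conn.getOutputStream
      try out.write(message.getBytes(StandardCharsets.UTF_8)) finally out.close()
      // reading the response code forces the request to complete
      println("alert delivered, HTTP " + conn.getResponseCode)
      conn.disconnect()
    }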
