Or try Streaming SQL? It's a simple layer on top of Spark Streaming. ☺

https://github.com/Intel-bigdata/spark-streamingsql
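
For a flavor of the API, here is a rough sketch pieced together from memory of
that project's README. I haven't run it, so treat StreamSQLContext,
registerDStreamAsTable, and the package path as assumptions to verify against
the repo:

import org.apache.spark.sql.SQLContext
import org.apache.spark.streaming.{Seconds, StreamingContext}
// Ships with spark-streamingsql; package path unverified.
import org.apache.spark.sql.streaming.StreamSQLContext

case class Word(text: String)

val ssc = new StreamingContext(sc, Seconds(1))  // sc: existing SparkContext
val streamSqlContext = new StreamSQLContext(ssc, new SQLContext(sc))

// Register a DStream as a table, then filter it with plain SQL.
val words = ssc.socketTextStream("localhost", 9999).map(Word(_))
streamSqlContext.registerDStreamAsTable(words, "words")
streamSqlContext.sql("SELECT text FROM words WHERE text LIKE 'a%'").print()

ssc.start()
ssc.awaitTermination()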


From: Cassa L [mailto:lcas...@gmail.com]
Sent: Thursday, November 5, 2015 8:09 AM
To: Adrian Tanase
Cc: Stefano Baghino; user
Subject: Re: Rule Engine for Spark

Thanks for the reply. How about Drools? Does it work with Spark?

LCassa

On Wed, Nov 4, 2015 at 3:02 AM, Adrian Tanase <atan...@adobe.com> wrote:
Another way to do it is to extract your filters as SQL code and load it in a 
transform – which allows you to change the filters at runtime.

Inside the transform you could apply the filters by going RDD -> DF -> SQL -> 
RDD.
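
A minimal sketch of that round trip, assuming Spark 1.5 (for
SQLContext.getOrCreate) and a hypothetical Event case class — the schema, the
table name and where the rule string comes from are all illustrative:

import org.apache.spark.sql.SQLContext
import org.apache.spark.streaming.dstream.DStream

// Hypothetical schema; substitute your own event type.
case class Event(id: String, value: Double)

object DynamicFilter {
  // Mutable reference so the SQL can be swapped between batches,
  // e.g. refreshed from a config store. transform's closure runs on
  // the driver every batch, so it always sees the latest value.
  @volatile var filterSql = "SELECT * FROM events WHERE value > 10.0"

  def applyFilter(stream: DStream[Event]): DStream[Event] =
    stream.transform { rdd =>
      val sqlContext = SQLContext.getOrCreate(rdd.sparkContext)
      import sqlContext.implicits._
      // RDD -> DF: expose the batch under a table name.
      rdd.toDF().registerTempTable("events")
      // DF -> SQL -> RDD: run the current filter and map rows back.
      sqlContext.sql(filterSql)
        .map(row => Event(row.getAs[String]("id"), row.getAs[Double]("value")))
    }
}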

Lastly, depending on how complex your filters are, you could skip SQL and 
create your own mini-DSL that runs inside transform. I’d definitely start here 
if the filter predicates are simple enough…
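
As an illustration of how small such a DSL can be, here is a toy sketch that
compiles rules of the form "field op literal" into predicates — the record
layout and the rule grammar are made up for the example:

object MiniDsl {
  // Purely illustrative record shape; adapt to your event type.
  type Record = Map[String, Double]

  // Compile e.g. "value > 10.0" into a predicate. Missing fields
  // become NaN, so every comparison against them fails.
  def compile(rule: String): Record => Boolean = {
    val Array(field, op, literal) = rule.trim.split("\\s+")
    val threshold = literal.toDouble
    op match {
      case ">"  => r => r.getOrElse(field, Double.NaN) > threshold
      case "<"  => r => r.getOrElse(field, Double.NaN) < threshold
      case "==" => r => r.getOrElse(field, Double.NaN) == threshold
      case bad  => throw new IllegalArgumentException(s"unknown operator: $bad")
    }
  }
}

// Re-compiled inside transform on every batch, so rules change at runtime:
// stream.transform(rdd => rdd.filter(MiniDsl.compile(currentRule)))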

-adrian

From: Stefano Baghino
Date: Wednesday, November 4, 2015 at 10:15 AM
To: Cassa L
Cc: user
Subject: Re: Rule Engine for Spark

Hi LCassa,
unfortunately I don't have actual experience on this matter, however for a 
similar use case I have briefly evaluated 
Decision<https://github.com/Stratio/Decision> (then literally called Streaming 
CEP Engine) and it looked interesting. I hope it helps.

On Wed, Nov 4, 2015 at 1:42 AM, Cassa L <lcas...@gmail.com> wrote:
Hi,
Has anyone used a rule engine with Spark Streaming? I have a case where data 
is streaming from Kafka and I need to apply some rules to it (instead of 
hard-coding them in the application).
Thanks,
LCassa



--
BR,
Stefano Baghino
Software Engineer @ Radicalbit
