You can deploy and invoke Drools as a singleton on every Spark worker node / executor / worker JVM.
You can invoke it from e.g. map, filter, etc., and use the result from the rule to decide how to transform or filter an event/message.

From: Antonio Giambanco [mailto:antogia...@gmail.com]
Sent: Friday, May 22, 2015 9:43 AM
To: user@spark.apache.org
Subject: Spark Streaming and Drools

Hi All,

I'm deploying an architecture that uses Flume for sending log information into a sink. Spark Streaming reads from this sink (pull strategy) and processes all this information; during this processing I would like to do some event processing. For example: a log appender writes information about all transactions on my trading platforms, and if a platform user sells more than they buy during a week, I need to receive an alert on an event dashboard.

How can I realize it? Is it possible with Drools?

Thanks so much
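The per-executor singleton pattern suggested above can be sketched in Java. This is a minimal illustration, not a definitive implementation: the class name RulesEngine and the evaluate helper are hypothetical, and the actual Drools KieSession construction is left as a comment so the pattern itself stays self-contained and runnable. Each Spark executor JVM would lazily build exactly one engine instance, shared by all tasks running in that JVM.

```java
// Hypothetical per-JVM holder for a rules engine. In a real deployment the
// commented-out line in the constructor would build a Drools session; here a
// trivial stand-in rule is used so the singleton pattern can be demonstrated.
public class RulesEngine {
    // Lazily initialized, one instance per JVM (i.e. per Spark executor).
    private static volatile RulesEngine instance;

    private RulesEngine() {
        // In real code, create the Drools session here, e.g. (assumption):
        // kieSession = KieServices.Factory.get()
        //                  .getKieClasspathContainer().newKieSession();
    }

    // Double-checked locking: safe with a volatile field on modern JVMs.
    public static RulesEngine get() {
        if (instance == null) {
            synchronized (RulesEngine.class) {
                if (instance == null) {
                    instance = new RulesEngine();
                }
            }
        }
        return instance;
    }

    // Stand-in for inserting an event, firing rules, and reading the verdict.
    // Real code would insert the fact into the session and fire rules.
    public boolean evaluate(String event) {
        return event.contains("SELL");
    }

    public static void main(String[] args) {
        // Inside a Spark map/filter you would call RulesEngine.get().evaluate(e);
        // because the field is static, repeated calls from any task in the same
        // executor JVM reuse the single instance.
        System.out.println(RulesEngine.get().evaluate("SELL EURUSD")); // matches
        System.out.println(RulesEngine.get().evaluate("BUY EURUSD"));  // no match
        System.out.println(RulesEngine.get() == RulesEngine.get());    // same instance
    }
}
```

In Spark, the call site would typically be inside mapPartitions or a map/filter closure, so the engine is looked up on the executor rather than serialized from the driver.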