Re: Spark Streaming and Drools

2015-05-29 Thread Antonio Giambanco
Hi all,
I wrote a simple Drools rule and I'm trying to fire it, but when I call
fireAllRules nothing happens and no exception is thrown. Do I need to set up
any configuration?

Thanks

A G
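In real Drools, fireAllRules() returns the number of rule activations fired, and a rule that matches nothing fires zero times without any exception, so the first things to check are that return value, that the .drl actually compiled (the builder collects errors rather than throwing), and that a fact matching the rule's type was inserted before firing. A minimal sketch of that insert-then-fire contract, using a hypothetical in-memory ToySession as a stand-in for Drools (the real API needs the Drools jars and is not reproduced here):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Predicate;

// Hypothetical in-memory stand-in for a Drools session: facts must be
// inserted before fireAllRules(), and a rule that matches no inserted fact
// fires zero times without throwing anything.
final class ToySession {
    private final List<Object> facts = new ArrayList<>();
    private final Predicate<Object> rule;

    ToySession(Predicate<Object> rule) { this.rule = rule; }

    void insert(Object fact) { facts.add(fact); }

    // Like Drools, return how many rule activations actually fired.
    int fireAllRules() {
        int fired = 0;
        for (Object f : facts) {
            if (rule.test(f)) fired++;
        }
        return fired;
    }
}

public class FireCheck {
    public static void main(String[] args) {
        ToySession session = new ToySession(
                f -> f instanceof Integer && ((Integer) f) > 10);
        // Forgetting insert(), or inserting a fact the rule's conditions
        // don't match, is the usual cause of a silent no-op.
        System.out.println(session.fireAllRules()); // 0: nothing inserted yet
        session.insert(42);
        System.out.println(session.fireAllRules()); // 1
    }
}
```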


RE: Spark Streaming and Drools

2015-05-22 Thread Evo Eftimov
Or you can run Drools in Central Server Mode, i.e. as a common/shared service,
but that would slow down your Spark Streaming job due to the remote network
call that has to be made for every single message.
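To see why the per-message remote call dominates, here is a back-of-the-envelope comparison (the latency numbers are illustrative assumptions, not measurements):

```java
public class RemoteCallBudget {
    public static void main(String[] args) {
        // Assumed latencies: roughly 1 ms round trip to a central rule
        // server vs roughly 1 microsecond for an in-process (singleton)
        // rule evaluation.
        double remoteRttSec = 1e-3;
        double localCallSec = 1e-6;

        // Upper bound on messages one task thread can push through per
        // second when every message pays the evaluation latency.
        double remotePerSec = 1.0 / remoteRttSec; // about 1,000 msg/s
        double localPerSec = 1.0 / localCallSec;  // about 1,000,000 msg/s

        System.out.printf("remote: %.0f msg/s, local: %.0f msg/s%n",
                remotePerSec, localPerSec);
    }
}
```

If a central server is unavoidable, batching several events per call or caching rule results can amortize some of that cost.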

 




RE: Spark Streaming and Drools

2015-05-22 Thread Evo Eftimov
I am not aware of existing examples, but you can always “ask” Google.

Basically, from a Spark Streaming perspective Drools is a third-party software
library; you would invoke it in the same way as any other third-party library,
from the tasks (maps, filters, etc.) within your DAG job.

 




RE: Spark Streaming and Drools

2015-05-22 Thread Evo Eftimov
You can deploy and invoke Drools as a singleton on every Spark worker node /
executor JVM.

You can invoke it from e.g. map or filter and use the result from the rule to
decide how to transform/filter an event/message.
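The per-executor singleton can be sketched with a lazy double-checked holder; RuleEngine here is a hypothetical stand-in for a compiled Drools container (the expensive one-time rule compilation would normally happen in the private constructor):

```java
import java.util.function.Predicate;

// Hypothetical stand-in for a compiled Drools rule container; in a real job
// the private constructor would load and compile the .drl files once.
final class RuleEngine {
    private static volatile RuleEngine instance;

    private RuleEngine() { /* expensive one-time rule compilation here */ }

    // Lazily created once per JVM, i.e. once per Spark executor.
    static RuleEngine get() {
        if (instance == null) {
            synchronized (RuleEngine.class) {
                if (instance == null) {
                    instance = new RuleEngine();
                }
            }
        }
        return instance;
    }

    // Toy rule: flag events whose value exceeds a threshold.
    boolean evaluate(double value) { return value > 100.0; }
}

public class SingletonSketch {
    public static void main(String[] args) {
        // Inside a Spark map/filter lambda you would call RuleEngine.get(),
        // so the rules are compiled once per executor, not once per record.
        Predicate<Double> alert = v -> RuleEngine.get().evaluate(v);
        System.out.println(alert.test(150.0));                    // true
        System.out.println(RuleEngine.get() == RuleEngine.get()); // true
    }
}
```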

 

From: Antonio Giambanco [mailto:antogia...@gmail.com] 
Sent: Friday, May 22, 2015 9:43 AM
To: user@spark.apache.org
Subject: Spark Streaming and Drools

 

Hi All,

I'm deploying an architecture that uses Flume to send log information to a
sink.

Spark Streaming reads from this sink (pull strategy) and processes all this
information; during this processing I would like to do some event processing.
For example:

a log appender writes information about all transactions on my trading
platforms;

if a platform user sells more than he buys during a week, I need to receive an
alert on an event dashboard.

How can I realize this? Is it possible with Drools?

Thanks so much
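The weekly sells-versus-buys alert asked about above is, at heart, a windowed aggregation; whether the threshold lives in a Drools rule or in plain code, the counting looks roughly like this self-contained sketch (the event format and names are invented for illustration, and a plain list stands in for one week's windowed batch of the stream):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch of the weekly alert: given one week's worth of (user, side) trade
// events, return the users who sold more than they bought.
public class WeeklyAlertSketch {

    public static List<String> usersToAlert(List<String[]> events) {
        Map<String, Integer> net = new HashMap<>(); // sells minus buys
        for (String[] e : events) {
            int delta = "SELL".equals(e[1]) ? 1 : -1;
            net.merge(e[0], delta, Integer::sum);
        }
        List<String> alerts = new ArrayList<>();
        for (Map.Entry<String, Integer> en : net.entrySet()) {
            if (en.getValue() > 0) { // sold more than bought this week
                alerts.add(en.getKey());
            }
        }
        alerts.sort(null); // deterministic order for display
        return alerts;
    }

    public static void main(String[] args) {
        List<String[]> week = new ArrayList<>();
        week.add(new String[]{"alice", "SELL"});
        week.add(new String[]{"alice", "SELL"});
        week.add(new String[]{"alice", "BUY"});
        week.add(new String[]{"bob", "BUY"});
        System.out.println(usersToAlert(week)); // [alice]
    }
}
```

In a real Spark Streaming job the same per-user counting would run inside a week-long window over the stream, with the alert either emitted directly or handed to a Drools rule for the final decision.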



Re: Spark Streaming and Drools

2015-05-22 Thread Dibyendu Bhattacharya
Hi,

Some time back I played with distributed rule processing by integrating Drools
with HBase coprocessors, invoking rules on any incoming data:

https://github.com/dibbhatt/hbase-rule-engine

You can get an idea of how to use Drools rules from this RegionObserver
coprocessor:

https://github.com/dibbhatt/hbase-rule-engine/blob/master/src/main/java/hbase/rule/HBaseDroolObserver.java

The idea is basically to create a stateless rule engine from the .drl file and
fire the rules on incoming data.

Even though the code invokes rules on an HBase Put object, you can get the
idea and modify it for Spark.

Dibyendu
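The build-once, execute-per-event shape described above can be sketched as follows; StatelessRuleEngine is a hypothetical stand-in for a stateless Drools session (the real build step compiles the .drl through the Drools knowledge builder, which needs the Drools jars, so a toy threshold "rule language" is used instead):

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for a stateless rule session built once from a rule
// file; the rule format here is a toy threshold string, not real DRL.
final class StatelessRuleEngine {
    private final double threshold;

    // In real Drools this build step would compile the .drl once, up front.
    StatelessRuleEngine(String rule) {
        // toy rule format: "alert when amount > N"
        this.threshold = Double.parseDouble(
                rule.substring(rule.lastIndexOf('>') + 1).trim());
    }

    // Stateless: each call evaluates one incoming event against the rules,
    // keeping no facts between calls.
    boolean execute(double amount) { return amount > threshold; }
}

public class StatelessSketch {
    public static void main(String[] args) {
        StatelessRuleEngine engine =
                new StatelessRuleEngine("alert when amount > 500");
        double[] incoming = {120.0, 750.0, 510.0};
        List<Double> alerts = new ArrayList<>();
        for (double amt : incoming) {
            if (engine.execute(amt)) alerts.add(amt);
        }
        System.out.println(alerts); // [750.0, 510.0]
    }
}
```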








RE: Spark Streaming and Drools

2015-05-22 Thread Evo Eftimov
The only “tricky” bit is when you want to manage/update the rule base in
Drools engines already running as singletons in executor JVMs on worker nodes.
Invoking Drools from Spark Streaming to evaluate a rule already loaded in
Drools is not a problem.
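One common way to handle that update problem is to hide the compiled rule base behind an AtomicReference inside the singleton: tasks always read the current engine through the reference, and an updater (for example a thread polling a shared store) swaps in a freshly built one without restarting the job. A sketch, with a hypothetical RuleBase standing in for the compiled Drools container:

```java
import java.util.concurrent.atomic.AtomicReference;

// Sketch of updating the rule base inside an already-running per-executor
// singleton: readers go through an AtomicReference, so an updater can swap
// in a freshly built rule base while tasks keep evaluating.
public class HotSwapSketch {
    // Hypothetical compiled rule base; stands in for a Drools container.
    public static final class RuleBase {
        final double threshold;
        public RuleBase(double threshold) { this.threshold = threshold; }
        boolean fire(double v) { return v > threshold; }
    }

    private static final AtomicReference<RuleBase> CURRENT =
            new AtomicReference<>(new RuleBase(100.0));

    public static boolean evaluate(double v) { return CURRENT.get().fire(v); }

    public static void updateRules(RuleBase fresh) { CURRENT.set(fresh); }

    public static void main(String[] args) {
        System.out.println(evaluate(150.0)); // true: threshold is 100
        updateRules(new RuleBase(200.0));    // hot-swap a new rule base
        System.out.println(evaluate(150.0)); // false: threshold is now 200
    }
}
```

In-flight evaluations finish against the engine they already fetched, and subsequent calls see the new one; no locking is needed on the read path.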

 


 



Re: Spark Streaming and Drools

2015-05-22 Thread Antonio Giambanco
Thanks a lot Evo,
do you know where I can find some examples?

Have a great one

A G
