Thanks for your reply, Jason,

I can use a stateless session in the Spark Streaming job.

But now my question is: when the rules are updated, how do we pass them to the RDD?

We generate a ruleExecutor (stateless session) in the main method,

then pass the ruleExecutor into the RDD.
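
Roughly what we have in mind is the sketch below (just an idea under assumptions, not our real code): as far as I can tell the Drools session itself does not serialize cleanly into the Spark closure, so we would broadcast only the DRL text each batch and rebuild the stateless session on the executors. LogEvent, loadDrlText() and the file path are placeholders.

    import scala.io.Source

    import org.apache.spark.SparkContext
    import org.apache.spark.streaming.dstream.DStream
    import org.kie.api.KieServices
    import org.kie.api.runtime.StatelessKieSession

    // Hypothetical fact type for the parsed log lines.
    case class LogEvent(source: String, temperature: Double)

    object RuleExecutor {

      // Hypothetical: read the latest DRL text from some shared location
      // (HDFS, a database, ...) that the driver can reach.
      def loadDrlText(): String = Source.fromFile("/path/to/rules.drl").mkString

      // Compile DRL text into a fresh stateless session; called on the executors.
      def fromDrl(drl: String): StatelessKieSession = {
        val ks  = KieServices.Factory.get()
        val kfs = ks.newKieFileSystem().write("src/main/resources/rules.drl", drl)
        val kb  = ks.newKieBuilder(kfs)
        kb.buildAll()
        ks.newKieContainer(kb.getKieModule.getReleaseId).newStatelessKieSession()
      }

      def run(sc: SparkContext, events: DStream[LogEvent]): Unit = {
        events.foreachRDD { rdd =>
          // Driver side, once per batch: pick up the (possibly updated) rule text
          // and broadcast only the String -- the session itself is not serializable.
          val drl = sc.broadcast(loadDrlText())
          rdd.foreachPartition { partition =>
            val session = fromDrl(drl.value)   // rebuilt per partition
            partition.foreach(event => session.execute(event))
          }
          drl.unpersist()
        }
      }
    }

Rebuilding the container for every partition is wasteful, so we would probably cache it on the executor keyed by a rule version, but that is the general shape. Does that match how you would do it?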

 

I am new to Drools; I am reading the Drools documentation now.

Best Regards,

Evan

From: Jason Nerothin [mailto:jasonnerot...@gmail.com] 
Sent: 18 April 2016 21:42
To: yaoxiaohua
Cc: user@spark.apache.org
Subject: Re: drools on spark, how to reload rule file?

 

The limitation is in the Drools implementation.

 

Changing a rule in a stateful KB is not possible, particularly if it leads to 
logical contradictions with the previous version or any other rule in the KB.

 

When we ran into this, we worked around (part of) it by salting the rule name 
with a unique id. To get the existing rules to be evaluated when we wanted, we 
kept a property on each fact that we mutated each time. 
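
Roughly, the salting looked something like the sketch below (from memory, and only for illustration: the rule body, the LogEvent/Alert types and the version field are made up):

    import java.util.UUID

    // Hypothetical version counter, bumped on every rule change; each fact also
    // carried a matching field that we mutated so only the rules belonging to
    // the current generation would fire against it.
    val ruleVersion = 42

    // Salt the rule name so the new definition never collides with an earlier
    // one already loaded into the stateful KB.
    val saltedRule =
      s"""rule "high_temp_${UUID.randomUUID()}"
         |when
         |    $$e : LogEvent( version == $ruleVersion, temperature > 90 )
         |then
         |    insert(new Alert($$e.getSource()));
         |end
         |""".stripMargin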

 

Hackery, but it worked.

 

I recommend you try hard to use a stateless KB, if it is possible.
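
If you really do need rules to change at runtime, one Drools facility built for that is the KieScanner. A minimal sketch, assuming the rules are packaged as a KJAR and deployed to a Maven repository the cluster can reach (coordinates are made up, and kie-ci must be on the classpath):

    import org.kie.api.KieServices

    val ks = KieServices.Factory.get()

    // Hypothetical coordinates of a KJAR holding the rules; "LATEST" lets the
    // scanner pick up newly deployed versions from the Maven repository.
    val releaseId = ks.newReleaseId("com.example", "log-rules", "LATEST")
    val container = ks.newKieContainer(releaseId)

    // Poll the repository and swap the rules inside the container in place;
    // sessions created from the container afterwards see the new rules.
    val scanner = ks.newKieScanner(container)
    scanner.start(60000L)          // check every 60 seconds

    val session = container.newStatelessKieSession()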

Thank you.

 

Jason

 

// brevity and poor typing by iPhone


On Apr 18, 2016, at 04:43, yaoxiaohua <yaoxiao...@outlook.com> wrote:

Hi bros,

                I am trying to use Drools on Spark to parse logs, do some rule matching, and derive some fields.

                Now I am following one blog post from Cloudera:

http://blog.cloudera.com/blog/2015/11/how-to-build-a-complex-event-processing-app-on-apache-spark-and-drools/

                

                Now I want to know: is it possible to reload the rules on the fly?

                Thanks in advance.

 

Best Regards,

Evan
