Thank you again, Steve.


We need to make all kinds of changes to the rules, but small changes are more 
frequent than significant ones. So every time we deploy a new rule, there is a 
risk of it either not compiling or failing to properly follow the business 
logic.



We can trust users with any changes; however, moving the code to production is 
a big deal. It should be vetted by an authority figure, and there must be a 
simple and transparent rollback plan.

Yes, we want to be very risk-averse.



Ideally, we would like to have:

1.  A staging environment where automation tests are run;

2.  A change can be deployed to production only if all automation tests have 
passed;

3.  Some kind of administration console from which a change can be manually 
deployed to production (via uploading to production Maven repository or in some 
other way);

4.  Production Drools system picking up new changes without interruption of 
service (see the sketch after this list);

5.  Production console function allowing a one-click rollback of a recent 
change.
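
For items 3 and 4, a minimal sketch of what I have in mind, assuming a Drools 
6+ / KIE setup where rule releases are published to a Maven repository (the 
group and artifact IDs are made up):

    import org.kie.api.KieServices;
    import org.kie.api.builder.KieScanner;
    import org.kie.api.builder.ReleaseId;
    import org.kie.api.runtime.KieContainer;

    public class RuleContainerBootstrap {
        public static KieContainer start() {
            KieServices ks = KieServices.Factory.get();

            // Hypothetical GAV of the rules artifact in the production Maven repo
            ReleaseId releaseId =
                    ks.newReleaseId("com.example.rules", "business-rules", "RELEASE");

            // The container resolves the artifact from the configured Maven repositories
            KieContainer container = ks.newKieContainer(releaseId);

            // The scanner polls the repo and swaps in newly released versions
            // without restarting the service
            KieScanner scanner = ks.newKieScanner(container);
            scanner.start(60_000L); // poll every 60 seconds

            return container;
        }
    }

The "RELEASE" meta-version (or a version range) is what lets the scanner pick 
up newly published versions; pinning an exact version would freeze the 
container.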



Alex



P.S. Sorry, it looks like I have not mastered replying properly to a forum thread.





> Pretty much correct.

>

> re. 5 - It depends on what you mean by it becoming clear that a release is a 
> bad one.

>

> I have tended to code up my own knowledge base reloads and check for errors, 
> but I'm pretty sure that if your rules don't compile, then neither the 
> KnowledgeAgent nor the KieScanner will deploy them. If you use Guvnor, then 
> your project will not be built and packaged if the rules don't compile.
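
A rough sketch of running that compile check explicitly, e.g. as a CI step 
before anything is published (assuming Drools 6+; the DRL path and content are 
made up):

    import org.kie.api.KieServices;
    import org.kie.api.builder.KieBuilder;
    import org.kie.api.builder.KieFileSystem;
    import org.kie.api.builder.Message;

    public class RuleCompileCheck {
        public static void main(String[] args) {
            KieServices ks = KieServices.Factory.get();
            KieFileSystem kfs = ks.newKieFileSystem();

            // Hypothetical rule source; in practice this would come from the project
            kfs.write("src/main/resources/rules/example.drl",
                      "rule \"example\" when then end");

            KieBuilder builder = ks.newKieBuilder(kfs).buildAll();

            // Fail the build (and therefore the deployment) on any compilation error
            if (builder.getResults().hasMessages(Message.Level.ERROR)) {
                throw new IllegalStateException(
                        "Rules do not compile:\n" + builder.getResults());
            }
        }
    }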

>

> However, if the problem is that the new rules are just 'wrong' within your 
> domain, then it's hard to think of any way in which that could be detected 
> automatically, other than by you yourself writing the validation.

>

> To help with this, I have previously set up a FitNesse server which would 
> load in the latest rules and evaluate them, ensuring that output expectations 
> are met. However, no such test suite is perfect. It may be that a change is 
> made which needs a new test to evaluate it. If that test is not written, then 
> the suite of tests still passes.
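
A plain JUnit variant of the same idea (load the rules, insert facts, assert 
on the expected outcome); the Order fact and the 10% discount expectation are 
just illustrative:

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;
    import org.kie.api.KieServices;
    import org.kie.api.runtime.KieContainer;
    import org.kie.api.runtime.KieSession;

    public class DiscountRuleTest {

        // Hypothetical domain fact used by the rules
        public static class Order {
            private final double total;
            private double discount;
            public Order(double total) { this.total = total; }
            public double getTotal() { return total; }
            public double getDiscount() { return discount; }
            public void setDiscount(double discount) { this.discount = discount; }
        }

        @Test
        public void largeOrdersGetTenPercentDiscount() {
            KieServices ks = KieServices.Factory.get();
            // Loads the rules packaged on the test classpath,
            // e.g. the latest staging build
            KieContainer container = ks.getKieClasspathContainer();
            KieSession session = container.newKieSession();
            try {
                Order order = new Order(1000.0);
                session.insert(order);
                session.fireAllRules();
                // Expected business outcome; the rule that sets it is assumed
                assertEquals(0.10, order.getDiscount(), 0.0001);
            } finally {
                session.dispose();
            }
        }
    }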

>

> Similarly, you can write unit tests for the build. You can deploy to a 
> staging server, where the rules can be evaluated with as-live data, so that 
> you can regression test the rules service in isolation from the rest of your 
> application.

>

> Looking at rollback, in one Guvnor-based system, I have the users take a 
> snapshot for each rules deployment. They then copy that snapshot to an 
> "approved" snapshot. This way, rollback is just a case of copying the 
> previous version to "approved" and deploying that. The users are legal and 
> back office operations teams, and they are pretty efficient at following this 
> process these days.
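
Outside Guvnor, if the production container is driven through the KIE API, the 
programmatic equivalent of that rollback is just pointing the container at the 
previously approved release (GAV made up; an admin console could wire this to 
a one-click action):

    import org.kie.api.KieServices;
    import org.kie.api.builder.ReleaseId;
    import org.kie.api.runtime.KieContainer;

    public class RuleRollback {

        public static void rollback(KieContainer container, String approvedVersion) {
            KieServices ks = KieServices.Factory.get();

            // Hypothetical GAV; the previously approved version comes from the console
            ReleaseId previous =
                    ks.newReleaseId("com.example.rules", "business-rules", approvedVersion);

            // Swaps the running rules in place, without restarting the service
            container.updateToVersion(previous);
        }
    }

If a KieScanner is polling the repository, it would need to be stopped (or the 
bad release withdrawn) first, or it may simply pick the newer version up again.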

>

> However, in the end it comes down to things like:

> What kind of rule changes do users typically make? i.e. Are they just 
> changing some numbers in existing decision tables?

> Can you trust the users to only make non-risky changes? Guvnor won't stop 
> them from altering the structure of decision tables, or adding new 
> non-decision-table rules.

> How risk-averse are you?

>

> Steve