We have been using our own flavor of Fit for Rules (which is built on top of Fit) for about a year and a half now to test our business logic written in JBoss Rules 5. It's relatively easy to get the Business Analyst on board, since he is using his own tool (which is Microsoft Excel) to communicate test cases for the rules. So in theory, he writes the tests in Excel, we do the rules coding and voilà, all tests turn green.
In reality, we have to tweak the Excel sheets. We need to put in imports of our fact model, insert facts and create objects within that not-so-programmer-friendly table environment. A couple of days ago we got the request to tweak some rules and we all had to start doing rules again (and we used to use Eclipse for writing rules, because it's the only IDE with a plugin for that).
After half a day of coding Java syntax in Excel sheets, we decided that the ramp-up time for not-so-knowledgeable rules/Fit programmers like me is too long. Between debugging and copy-and-paste, we easily spent 5-10 times more time making the tests work than writing the code itself. Test-driven development isn't really an option here, since you need to know the imports of the rules file just to get the sheet to compile.
So what did we do? Well, why not get things working the way we used to? TestNG, anyone?
There are many pros to unit testing, but also some cons. The biggest one is that we lose the direct communication with the business analyst. It's always better if someone else writes the test and I just have to implement the solution. Maybe we can find another approach involving active specifications or a DSL. For now we stick to unit tests, and we have to make sure we convert every Excel test case to Java code (but hey, that's what our code reviews are for).
Check out our current base class for testing our rules:
import java.io.File;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;
import java.util.NoSuchElementException;
import java.util.Set;

import org.apache.log4j.Logger;
import org.drools.KnowledgeBase;
import org.drools.KnowledgeBaseConfiguration;
import org.drools.KnowledgeBaseFactory;
import org.drools.builder.KnowledgeBuilder;
import org.drools.builder.KnowledgeBuilderError;
import org.drools.builder.KnowledgeBuilderErrors;
import org.drools.builder.KnowledgeBuilderFactory;
import org.drools.builder.ResourceType;
import org.drools.io.ResourceFactory;
import org.drools.runtime.StatefulKnowledgeSession;
import org.drools.runtime.rule.QueryResults;

public abstract class AbstractRulesTest {

  public abstract String[] getRulesFileNames();

  // Query injected into every knowledge base so we can collect the
  // FRulesFinding objects our rules insert into working memory.
  private static final String GET_FINDINGS =
      "import com.maxheapsize.FRulesFinding;\n" +
      "query \"getAllRulesFindings\"\n" +
      "  finding : FRulesFinding()\n" +
      "end";

  private static final Logger LOG = Logger.getLogger(AbstractRulesTest.class);

  public final List<FRulesFinding> fireRules(Set<?> factsForWorkingMemory) {
    KnowledgeBase ruleBase = setUpKnowledgeBase();
    return fireRules(ruleBase, factsForWorkingMemory);
  }

  public KnowledgeBase setUpKnowledgeBase() {
    KnowledgeBaseConfiguration configuration = KnowledgeBaseFactory.newKnowledgeBaseConfiguration();
    KnowledgeBase ruleBase = KnowledgeBaseFactory.newKnowledgeBase(configuration);

    KnowledgeBuilder builder = KnowledgeBuilderFactory.newKnowledgeBuilder();
    // First the findings query, then every rules file the subclass names.
    builder.add(ResourceFactory.newReaderResource(new StringReader(GET_FINDINGS)), ResourceType.DRL);
    for (String fileName : getRulesFileNames()) {
      builder.add(ResourceFactory.newFileResource(new File(fileName)), ResourceType.DRL);
    }

    handleBuilderErrors(builder);

    ruleBase.addKnowledgePackages(builder.getKnowledgePackages());
    return ruleBase;
  }

  private void handleBuilderErrors(KnowledgeBuilder builder) {
    if (builder.hasErrors()) {
      KnowledgeBuilderErrors errors = builder.getErrors();
      for (KnowledgeBuilderError error : errors) {
        int[] lines = error.getErrorLines();
        LOG.error("Error at : " + lines[0] + " : " + lines[1]);
        LOG.error(error.getMessage());
      }
    }
  }

  private List<FRulesFinding> fireRules(KnowledgeBase ruleBase, Set<?> facts) {
    List<FRulesFinding> result = new ArrayList<FRulesFinding>();
    StatefulKnowledgeSession statefulSession = ruleBase.newStatefulKnowledgeSession();
    for (Object fact : facts) {
      statefulSession.insert(fact);
    }
    statefulSession.fireAllRules();

    QueryResults results = statefulSession.getQueryResults("getAllRulesFindings");
    try {
      FRulesFinding finding = (FRulesFinding) results.iterator().next().get("finding");
      result.add(finding);
    } catch (NoSuchElementException e) {
      // No rule fired a finding; return the empty list.
    }
    statefulSession.dispose();
    return result;
  }
}
All my rules insert an FRulesFinding (and only one at the moment) into the working memory when triggered. The rest is pretty easy. You subclass AbstractRulesTest, override getRulesFileNames and call fireRules with a set of objects (your test facts) which need to be inserted into the working memory. To get the finding back, fireRules executes the "getAllRulesFindings" query that was added to the knowledge base via the GET_FINDINGS string. Its result contains the outcome of your rule execution.
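For reference, a rule that feeds this query might look like the following DRL sketch. The rule name, the fact class FMyFact and the FRulesFinding constructor taking an FStatus are assumptions for illustration, not taken from our actual rules files:

```
package com.maxheapsize.rules

import com.maxheapsize.FMyFact;
import com.maxheapsize.FRulesFinding;
import com.maxheapsize.FStatus;

rule "demoRule"
when
    // fires for every green FMyFact in working memory
    FMyFact( color == "green" )
then
    // the finding is what the "getAllRulesFindings" query picks up
    insert( new FRulesFinding( FStatus.OK ) );
end
```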
Sample code would look like this:
import java.util.HashSet;
import java.util.List;
import java.util.Set;

import org.testng.Assert;
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;

public class RulesTest extends AbstractRulesTest {

  private Set<Object> facts;

  @BeforeMethod
  public void setUp() {
    facts = new HashSet<Object>();
  }

  @Override
  public String[] getRulesFileNames() {
    return new String[]{
        "src/main/rules/myrules.drl",
        "src/main/rules/generalRules.drl"
    };
  }

  @Test
  public void testDemoRule() {
    FMyFact myFact = new FMyFact();
    myFact.setColor("green");
    facts.add(myFact); // add all your facts here

    List<FRulesFinding> findings = fireRules(facts);
    Assert.assertEquals(findings.size(), 1);

    FRulesFinding finding = findings.get(0);
    Assert.assertEquals(finding.getStatus(), FStatus.OK);
  }
}
Depending on how you cut your rules, you can extract the status assertion into a shared helper.
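Such an extracted assertion could look like the following sketch. The helper name assertSingleFinding is my own invention, and the nested FStatus and FRulesFinding stand-ins only mimic the fact model from the post so the example compiles on its own:

```java
import java.util.Arrays;
import java.util.List;

public class StatusAssertions {

  // Minimal stand-ins for the post's fact model (names assumed, not the real classes).
  public enum FStatus { OK, FAILED }

  public static class FRulesFinding {
    private final FStatus status;
    public FRulesFinding(FStatus status) { this.status = status; }
    public FStatus getStatus() { return status; }
  }

  // Extracted assertion: verify there is exactly one finding with the
  // expected status, and hand it back for further checks.
  public static FRulesFinding assertSingleFinding(List<FRulesFinding> findings, FStatus expected) {
    if (findings.size() != 1) {
      throw new AssertionError("expected exactly one finding, got " + findings.size());
    }
    FRulesFinding finding = findings.get(0);
    if (finding.getStatus() != expected) {
      throw new AssertionError("expected status " + expected + " but was " + finding.getStatus());
    }
    return finding;
  }

  public static void main(String[] args) {
    List<FRulesFinding> findings = Arrays.asList(new FRulesFinding(FStatus.OK));
    FRulesFinding finding = assertSingleFinding(findings, FStatus.OK);
    System.out.println(finding.getStatus()); // prints OK
  }
}
```

Each test then shrinks to building its facts, calling fireRules and one line of assertion.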
Each test case from Excel now takes about 5-10 lines of Java code. Considering we cover each rule with about 5-15 test cases and boundary conditions, this amounts to 75-150 lines of test code per rule. I'll take that any day over programming in Excel.