Claude,
That makes the case for not writing complex tests, and also for having
simple tests and then building on them, ideally one feature at a time. It
was nicer when JUnit ran tests in a predictable order.
In this case there is a single call to the parser. It simply avoids
needing a separate
parseRuleStringNegative("...") ;
Insert DRYness here.
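For concreteness, a minimal sketch of what that separate negative helper would look like, i.e. the duplication the @Test(expected=...) annotation avoids. parseRule, RuleParseException and the message text are hypothetical stand-ins, not the project's real API:

```java
// The duplication @Test(expected=...) avoids: a second, negative-only
// helper alongside parseRuleString. parseRule and RuleParseException
// are stand-ins for the real parser API.
public class NegativeHelperSketch {

    static class RuleParseException extends RuntimeException {
        RuleParseException(String msg) { super(msg); }
    }

    // Hypothetical parser: rejects anything not starting with "[rule".
    static void parseRule(String ruleString) {
        if (!ruleString.startsWith("[rule"))
            throw new RuleParseException("Malformed rule: " + ruleString);
    }

    // The extra helper: passes only when parsing fails.
    static void parseRuleStringNegative(String ruleString) {
        try {
            parseRule(ruleString);
        } catch (RuleParseException e) {
            return;   // expected failure - the negative test passes
        }
        throw new AssertionError("Expected a parse failure for: " + ruleString);
    }

    public static void main(String[] args) {
        parseRuleStringNegative("not a rule");   // returns silently
        System.out.println("negative helper ok");
    }
}
```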
Andy
On 24/06/14 07:16, Claude Warren wrote:
Just a quick difference of opinion.
I think that
@Test(expected=Rule.SparqlRuleParserException.class)
public void sparqlInRule_NN() { parseRuleString("...") ; }
is a bad pattern, and that tests where an exception is expected are better
served by try/catch blocks that catch the expected exception and, ideally,
validate that the expected exception text is present. In the case noted
here, if there are several places in the code where SparqlRuleParserException
could be thrown, the catch block would check that the message text was as
expected, to verify that the correct one was thrown.
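A minimal sketch of that try/catch pattern, with the message text checked so we know the right exception was thrown. RuleParseException, parseRule and the message wording here are stand-ins, not the project's real API:

```java
// Try/catch pattern for expected exceptions: fail if no exception, and
// check the message when one is caught, so the *right* failure is verified.
// RuleParseException and parseRule are hypothetical stand-ins.
public class ExpectedExceptionSketch {

    static class RuleParseException extends RuntimeException {
        RuleParseException(String msg) { super(msg); }
    }

    // Hypothetical parser: rejects anything not starting with "[rule".
    static void parseRule(String ruleString) {
        if (!ruleString.startsWith("[rule"))
            throw new RuleParseException("Malformed rule: " + ruleString);
    }

    // Returns true only if parsing failed with the expected message text.
    static boolean failsAsExpected(String ruleString, String expectedText) {
        try {
            parseRule(ruleString);
            return false;   // in JUnit: Assert.fail("expected a parse error")
        } catch (RuleParseException e) {
            return e.getMessage().contains(expectedText);
        }
    }

    public static void main(String[] args) {
        System.out.println(failsAsExpected("not a rule", "Malformed rule"));   // true
        System.out.println(failsAsExpected("[rule1: ...]", "Malformed rule")); // false
    }
}
```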
I got caught out too many times by using the expected property of the
annotation, only to discover, after much hair-pulling, that the code was
not throwing the version of the exception I was expecting.
Claude
On Mon, Jun 23, 2014 at 9:26 PM, Andy Seaborne <[email protected]> wrote:
Hi Miguel,
We're approaching the mid-term of the Google Summer of Code project. It's
a good time to assess the state against the plan and to replan the
remainder of the time. Could you please briefly write up how you see the
project going and say whether you think any replanning is needed. Also, make
sure you are getting what you want out of the project.
Looking at the GitHub repository, there are a few things I have comments on.
1/ Documentation. There will need to be documentation.
2/ Tests
There will need to be syntax tests and evaluation tests, positive and
negative in both cases. Having different classes for each kind helps to
organise them. SparqlInRulesTest.java seems to be heading towards being an
all-purpose single test class. My experience is that this does not help as
the testing grows. It is better to split the tests up, and not to group
them all into one test method.
(aside:
@RunWith(Parameterized.class): my experience is that this adds very little
in the sort of situation you are in, and can produce test failure reports
that are harder to understand.
)
One test case per @Test method, so that each test is testing one thing. It
makes tracking down test failures a lot easier. Organise the tests by
starting with simple ones and then moving on to complicated ones. When
tests fail, it is clearer what is failing: if some or all of the simple
tests are passing, that points to the features only in the complex tests;
and the other way round, if the simple tests fail, it is more likely a
fundamental issue, not a feature-specific piece of code.
Example:
parseRuleString(String) is a method to call the rules parser.
The test can be written succinctly with ....
@Test public void sparqlInRule_01()
{ parseRuleString("[rule1: (\\sparql .....") ; }
parseRuleString is then something like:
private static void parseRuleString(String ruleString) {
    // An exception will neatly fail the test in JUnit.
    List<Rule> rules = Rule.parseRules(ruleString);
    Assert.assertTrue(!rules.isEmpty());
    // ... any other checks ...
}
Note: it does not catch rule parse exceptions. It lets them propagate up.
For negative syntax tests, where you expect the parse to fail, write
something like:
@Test(expected=Rule.SparqlRuleParserException.class)
public void sparqlInRule_NN() { parseRuleString("...") ; }
then JUnit, in your IDE or from Maven, will produce more useful reports.
Other:
I see
/*
 * To change this license header, choose License Headers in Project Properties.
 * To change this template file, choose Tools | Templates
 * and open the template in the editor.
 */
This can be removed!!
Andy