Benjamin Bentmann wrote:

> For my curiosity: what would be the benefit of setting up a hand-crafted
> test suite? I am a lazy guy and prefer the dumb machine to do the nasty
> things for me, so I really like the idea of just dropping a test class
> into src/test/java without bothering to additionally maintain a test
> suite.

Typically a hand-crafted suite is used for slow tests: you can identify the subset of tests you want to run and run just those. (You can even pull one method from Class X, another method from Class Y, and so on.)
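
For example, a hand-crafted JUnit 3 suite might look like this (the class and method names here are invented for illustration):

    import junit.framework.Test;
    import junit.framework.TestCase;
    import junit.framework.TestSuite;

    // Hypothetical test classes; normally these would live in their own
    // files under src/test/java.
    class ClassXTest extends TestCase {
        public ClassXTest(String name) { super(name); }
        public void testFastPath()  { assertTrue(true); }
        public void testSlowQuery() { assertTrue(true); } // imagine: minutes
    }

    class ClassYTest extends TestCase {
        public ClassYTest(String name) { super(name); }
        public void testSlowImport() { assertTrue(true); } // also slow
    }

    // The suite cherry-picks one method from each class.
    public class SlowTestSuite {
        public static Test suite() {
            TestSuite suite = new TestSuite("slow tests only");
            suite.addTest(new ClassXTest("testSlowQuery"));
            suite.addTest(new ClassYTest("testSlowImport"));
            return suite;
        }
    }

JUnit honors the static suite() method, so only the two listed methods run; testFastPath is skipped entirely.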

> My personal concerns are
> a) the console output from Surefire during test execution and
> b) the redirected test output files (*-output.txt).

Surefire's design is to hand off control to the testing framework (JUnit/TestNG) and then report on the console when we get control back. For JUnit, we hand off control at the class level, so we get control back after every class. For TestNG, we have to hand it the entire test directory, because any one class may depend (via dependsOnMethods or dependsOnGroups) on methods in other classes.
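
To make that concrete, here's a sketch of a cross-class dependency in TestNG (hypothetical classes, each in its own file):

    import org.testng.annotations.Test;

    // SchemaTest.java
    public class SchemaTest {
        @Test(groups = "schema")
        public void createTables() { /* set up the database schema */ }
    }

    // QueryTest.java
    public class QueryTest {
        // TestNG must see both classes up front to order this correctly,
        // which is why Surefire hands it the whole test directory.
        @Test(dependsOnGroups = "schema")
        public void selectFromTables() { /* assumes createTables() ran */ }
    }

If Surefire fed TestNG one class at a time, QueryTest would fail or be skipped, because its dependency lives in a class TestNG hadn't seen yet.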

Furthermore, TestNG doesn't notify us at the start/end of every class; indeed, it's easy to configure TestNG so that methods from different classes run interleaved (some of Class X runs, then some of Class Y, then back to Class X, then Class Y finishes, etc.), and that's before you even consider parallelized testing.
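
For instance, a testng.xml along these lines (suite and package names invented) makes TestNG run methods from different classes concurrently across a thread pool, so there is no clean per-class boundary for us to report on:

    <suite name="example" parallel="methods" thread-count="4">
      <test name="all">
        <packages>
          <package name="com.example.tests"/>
        </packages>
      </test>
    </suite>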

We could log to the console on individual test methods as they run, though I'm sure we'd want to make it a configurable option. I've filed SUREFIRE-437 for the console logging issue.
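
If we do implement it, I'd expect it to be a thin listener on TestNG's side, something along these lines (a sketch only, not what Surefire actually ships; the class name is made up):

    import org.testng.ITestResult;
    import org.testng.TestListenerAdapter;

    public class PerMethodConsoleListener extends TestListenerAdapter {
        @Override
        public void onTestStart(ITestResult result) {
            System.out.println("Running "
                    + result.getTestClass().getName() + "."
                    + result.getMethod().getMethodName() + " ...");
        }

        @Override
        public void onTestFailure(ITestResult result) {
            System.out.println("FAILED: "
                    + result.getMethod().getMethodName());
        }
    }

(JUnit 4 has an equivalent hook in RunListener.) A listener like this can be registered through the <listeners> element of testng.xml.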

As for splitting the redirected output into separate txt files, the only way we could do it would be one txt file per METHOD, which is way too many files IMO.

> With Surefire 2.3.1, I got result feedback every half minute, as each
> test class finished.

It's odd that you describe this as a regression from 2.3.1... I couldn't get TestNG to work at all in Surefire 2.3.x, which is what prompted my work on Surefire 2.4.

-Dan
