> Date: Tue, 25 Nov 2008 23:31:17 +0100
 > From: [email protected]
 > 
 > Hi.
 > 
 > I have committed several new facilities for the testing of the
 > library.
 > 
 > Now we have a place to put the data files needed by the tests:
 > 
 >     torture/testdata/TD00001
 >                      TD00002
 >                      ...
 > 
 [...]
 > 
 > The meaning and contents of each data file are documented in a new
 > chapter of the Test Specification Document, "Test Data Files":
 > 
 > @table @file
 > @item TD00001
 > JBIG2 global segment for the page @file{TD00002}.
 > @item TD00002
 > JBIG2 page using global segment @file{TD00001}.
 > @item TD00003
 > Decoded JBIG2 page bitmap from page @file{TD00002} and global
 > segment @file{TD00001}.
 > @end table
 > 
 [...]
 > Also, unit tests that use data files should document them as in:
 > 
 > @deffn Test pdf_stm_read_010
 > Create a memory-based reading stream and attach a JBIG2 decoder to
 > it. Then decode a JBIG2 page using a global segment.
 > @table @strong
 > @item Success condition
 > The decoded bitmap should be equal to @file{TD00003}.
 > @item Data files
 > @itemize @minus
 > @item @file{TD00001}
 > @item @file{TD00002}
 > @item @file{TD00003}
 > @end itemize
 > @end table
 > @end deffn

Jemarch, if you agree, I'm willing to automate the unit testing documentation
process. The idea would be to follow two rules:

1. Define a format for the comment that documents each test procedure in the
   unit test files (the one used right now is probably fine, but I'm not sure
   it is the one used in _every_ test file).
2. For each 'TDXXXXX' file in 'torture/testdata', have a 'TDXXXXX.desc' text
   file with the description of that test data (see the example below).
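
For example (just to illustrate the idea, the exact format is still open),
'torture/testdata/TD00003.desc' could contain the same text that is now in
the Texinfo table:

    Decoded JBIG2 page bitmap from page TD00002 and global
    segment TD00001.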

If we keep to those rules, I can try to write a Perl script that generates the
Test Specification Document automatically. That means we would need to write
much less Texinfo code by hand. :-D
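
To make the idea concrete, here is a minimal sketch of what the script could
look like. Everything in it is an assumption for now: the script name, the
'.desc' layout, and the fact that it only emits the "Test Data Files" table:

    #!/usr/bin/perl
    # gen-tsd.pl (name made up): emit the "Test Data Files" chapter of the
    # Test Specification Document from the TDXXXXX.desc files.

    use strict;
    use warnings;

    my $testdata = shift @ARGV || 'torture/testdata';

    print "\@table \@file\n";
    for my $desc (sort (glob ("$testdata/TD*.desc")))
      {
        # TD00001.desc -> TD00001
        my ($name) = $desc =~ m{([^/]+)\.desc$};

        open (my $fh, '<', $desc) or die "cannot open $desc: $!";
        my $text = do { local $/; <$fh> };   # slurp the whole description
        close ($fh);
        chomp ($text);

        print "\@item $name\n$text\n";
      }
    print "\@end table\n";

The @deffn blocks for the unit tests could be generated in the same way once
we agree on the comment format from rule 1.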

cheers

-gerel

