On 5/3/19 3:13 PM, Smith, Barry F. wrote:


On May 3, 2019, at 3:57 PM, Scott Kruger <kru...@txcorp.com> wrote:



Sticking to the immediate issues and ignoring the other meta issues...

I think what you want could possibly be used to simplify the test harness if we
push things down to the petscdiff level.  If we
have petscdiff detect the failed diff, then it will automatically apply
the patches.  This would eliminate the "alt" files from the test
harness level.

   This could be fine. One could maybe even get away without using the patch
tool: simply store the diffs that appear, and compare the diff between the actual
output and the basic version against the stored diffs.

I'm not sure I understand this.

I was thinking of something like:

tests/output
            /ex1.out
            /ex1-1.patch
            /ex1-2.patch

And then have petscdiff automatically cycle through the patches (by patching into the local directory to avoid polluting the original repo).

The "update feature" of petscdiff shouldn't create patches, but it would be nice to have it automate the patch management in some way to make it a bit easier to develop tests.
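To make the cycling idea concrete, here is a minimal sketch of what petscdiff could do internally. This is not the real petscdiff interface; the file names, fixture contents, and `scratch.out` copy are all invented for illustration. It tries the base output first and, on a mismatch, applies each stored patch to a scratch copy so the repository files are never modified:

```shell
#!/bin/sh
# Hypothetical fixture: a base expected-output file, the actual run
# output (differing only in an iteration count), and one stored patch.
mkdir -p tests/output
printf 'iterations 5\nresidual 1.2e-10\n' > tests/output/ex1.out
printf 'iterations 6\nresidual 1.2e-10\n' > actual.out
diff -u tests/output/ex1.out actual.out > tests/output/ex1-1.patch

# Try the base output first; on mismatch, cycle through the patches,
# applying each to a scratch copy so the original is never touched.
result=FAIL
if diff -q tests/output/ex1.out actual.out >/dev/null; then
    result=PASS
else
    for p in tests/output/ex1-*.patch; do
        cp tests/output/ex1.out scratch.out
        patch -s scratch.out "$p"
        if diff -q scratch.out actual.out >/dev/null; then
            result="PASS (via $p)"
            break
        fi
    done
fi
echo "$result"
```

Because `patch` is given the target file explicitly, it applies the hunks to the scratch copy regardless of the file names in the patch header, which is what keeps the checked-in output files untouched.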

Scott



Of course, petscdiff is in bash and we've talked about replacing
it with a python version.  Matt has said he has a preliminary version
and I'd appreciate being able to use this as a starting point.

Scott


On 5/2/19 3:59 PM, Smith, Barry F. wrote:
    Scott and PETSc folks,
      Using alt files for testing is painful. Whenever you add, for example, a
new variable to be output in a viewer, it changes the output files and you need
to regenerate the alt files for all the test configurations, even though the
run behavior of the code hasn't changed.
     I'm looking for suggestions on how to handle this kind of alternative 
output in a nicer way (alternative output usually comes from different 
iterations counts due to different precision and often even different 
compilers).
     An idea I was thinking of: instead of having "alt" files we have "patch"
files that contain just the patch to the original output file instead of a complete copy. Thus in
some situations the patch file would still apply even if the original output file changed,
requiring much less manual work in updating alt files. Essentially the test harness would test
against the output file; if that fails it would apply the first patch and compare again, then try the
second patch, etc.
   Scott,
      What do you think? It should be an easy addition to the current model (no
need to even remove the alt testing). Would it also be possible to add a PATCH
option to the test rule where it automatically adds the new patch file?
Perhaps all the patches for a test case could be stored in the same file
so we don't need to manage patch_1.out, patch_2.out, etc.? Each new patch
would just get appended to the file.
     Thoughts?
    Barry
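As a rough sketch of the single-file idea above: the harness could keep all alternatives for one test in a single concatenated file and split it into individual patches at test time. The `%%` separator, file names, and placeholder patch bodies here are all invented for illustration, not an existing convention:

```shell
#!/bin/sh
# Hypothetical concatenated patch store: alternatives separated by a
# marker line ("%%"); the patch bodies below are placeholders.
cat > ex1.patches <<'EOF'
--- alternative 1
@@ -1 +1 @@
%%
--- alternative 2
@@ -1 +1 @@
EOF

# Split on the marker into ex1-1.patch, ex1-2.patch, ... which the
# harness can then try in order, exactly as with separate patch files.
awk -v base=ex1 'BEGIN { n = 1 }
     /^%%$/ { n++; next }
     { print > (base "-" n ".patch") }' ex1.patches
ls ex1-*.patch
```

Appending a new alternative would then just mean adding a `%%` line and the new patch to the end of the file, so only one file per test needs to be managed.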

--
Tech-X Corporation               kru...@txcorp.com
5621 Arapahoe Ave, Suite A       Phone: (720) 974-1841
Boulder, CO 80303                Fax:   (303) 448-7756

