Eike,

This is fantastic feedback. Thanks!
I've responded to specific issues inline. TL;DR: I fixed a lot of these
points. This should be closer to merge-ability.

-- 
Patrick Reynolds
Technical Leader
Kitware, Inc.
919 869 8848

On Saturday, September 28, 2013 at 5:36 AM, Rolf Eike Beer wrote:

> On Friday, September 27, 2013, 11:30:52, Patrick Reynolds wrote:
> > Hi folks,
> >
> > I just pushed a topic to the stage that adds support for coverage.py
> > Python coverage. It should actually work for anything that exports
> > Cobertura coverage.xml files.
> >
> > The topic is here:
> >
> > http://public.kitware.com/gitweb?p=stage/cmake.git;a=shortlog;h=refs/heads/Add-coverage.py-Coverage
>
> You are doomed! Now that you want to introduce a new coverage handler,
> I'll keep complaining until you add a test case for it, i.e. put a
> sample output file in the Tests/ directory and verify that CTest
> generates the expected results ;)

Challenge accepted! The test is added and I pushed the updated branch:
http://public.kitware.com/gitweb?p=stage/cmake.git;a=shortlog;h=refs/heads/Add-coverage.py-Coverage

> But there are some things about your code, and even about older code.
> First, you do not set error from cont.Error in
> cmCTestCoverageHandler::ProcessHandler().

Fixed on the aforementioned branch.

> Second, this whole gathering code will fail badly once two coverage
> systems are present and the second one gives an error. Say gcov finds
> 2 files. Then python has an error and returns -1. The file count is 1,
> and no error is propagated. This is not your fault, but please fix this
> in one commit first, before adding your new handler. And maybe make
> this block, from gcov to MUMPS, use an array, vector, list, or
> something and just iterate over it, so you only have to add your
> handler into the array.

Hmmm. I agree with you on this one; however, I think this may be beyond
the scope of my contribution.
If it's okay with the community, I would like to add a bug in Mantis for
this overall and treat it as a separate topic.

> Then I'm very unhappy with the coverage database being in the source
> directory. One setup I use daily at work has one checkout from which I
> run two builds with different compilers. In your case the coverage
> results will mix. Even worse, they will always sum up, because usually
> no one deletes a file from the source directory before running a test.

Fixed on the aforementioned branch, again. :)

> Another important question: can python handle the situation where two
> test programs run in parallel and want to write to the coverage file?
> Or will they just happily trash each other's results? That's a problem
> I fixed in 2.8.12 which would have happened for all other coverage
> generators.

I have not verified this experimentally, but coverage.py claims to
support this through their '-p' (parallel) option.

> Another thing, totally unrelated to your code, that I see when looking
> at the code: the Bullseye coverage handling seems broken:
>
> int cmCTestCoverageHandler::HandleBullseyeCoverage(
>   cmCTestCoverageHandlerContainer* cont)
> {
>   const char* covfile = cmSystemTools::GetEnv("COVFILE");
>   if(!covfile || strlen(covfile) == 0)
>     {
>     cmCTestLog(this->CTest, HANDLER_VERBOSE_OUTPUT,
>                " COVFILE environment variable not found, not running "
>                " bullseye\n");
>     /// nothing happened, return 0
>     return 0;
>     }
>   cmCTestLog(this->CTest, HANDLER_VERBOSE_OUTPUT,
>              " run covsrc with COVFILE=[" << covfile << "]"
>              << std::endl);
>   if(!this->RunBullseyeSourceSummary(cont))
>     {
>     cmCTestLog(this->CTest, ERROR_MESSAGE,
>                "Error running bullseye summary.\n");
>     //// error case, return 0
>     return 0;
>     }
>   cmCTestLog(this->CTest, DEBUG, "HandleBullseyeCoverage return 1 "
>              << std::endl);
>   //// default, probably everything fine: return 1
>   return 1;
> }
>
> if(this->HandleBullseyeCoverage(&cont))
>   {
>   return
>   cont.Error;
>   }
>
> Break if everything is fine, continue on error or when no files are
> found? All other handlers support being run in parallel, why not
> Bullseye?

I'll leave this one to folks who are better versed in the arcane art of
Bullseye...

> Eike
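P.S. On the parallel question above: for the archives, the usual way the
'-p' mode is driven looks roughly like this. The flags are coverage.py's
documented ones, but the script names are made up, and I have not run
this against the CTest integration:

```shell
# Run each test process with -p so coverage.py writes to a unique
# data file (.coverage.<suffix>) instead of one shared database.
coverage run -p test_one.py
coverage run -p test_two.py

# Merge the per-process data files into a single .coverage database...
coverage combine

# ...and export the Cobertura-style XML that the handler reads.
coverage xml -o coverage.xml
```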
-- 
Powered by www.kitware.com

Visit other Kitware open-source projects at
http://www.kitware.com/opensource/opensource.html

Please keep messages on-topic and check the CMake FAQ at:
http://www.cmake.org/Wiki/CMake_FAQ

Follow this link to subscribe/unsubscribe:
http://public.kitware.com/cgi-bin/mailman/listinfo/cmake-developers