Hi Raphael, hi everyone,

[...]

> > For a lack of time, I haven't yet looked into the GSoC proposals. Yet I
> > am also facing the very same issues in my current research activities,
> > where I'm trying to run our software verification tools on all the
> > packages in Debian. There is still quite a bit of work ahead of me until
> > I eventually get there, but at least automated builds using our own
> > research compiler infrastructure are happening (resulting in some >100
> > bug reports..., usertagged goto-cc).
>
> I just took a look at some of them and they seem very useful. Are those
> bug reports the result of manual verification? Or, more precisely, has
> there been a need for manual verification to avoid filing false positives,
> or are all the issues detected known (or even proved) to be true
> positives?
This point is indeed an extremely important one, at least as far as effort is
concerned. My initial goal was along the lines of testing our own tools and
debugging any issues that arose, so manual inspection was required anyway in
most cases to figure out what the bug in our tools could be. After some local
bug fixing, however, most of the remaining build failures turned out to be
problems in the packages being built rather than in our own tools. I
nevertheless decided to stay in the mode of assuming, by default, that the
bug is in our tools, and thus kept doing manual verification. This is a major
threat to scalability, but for the time being it provides better value to our
fellow developers, as almost all bug reports will contain sufficient
information for quickly devising a patch.

I think it would be a serious (and interesting!) research endeavour to
improve automatic diagnosis of possible bugs found during compile-time and
link-time static analyses. None of this is theoretically impossible, but
making it practical and actually useful poses big challenges.

> And is it possible that such a tool will eventually be publicly available?

Oh, actually all the tools are (with the exception of local patches still
being tested). As mentioned in another part of the thread, the scripts are
available from https://github.com/tautschnig/cprover-debian and the actual
tools behind them are part of the cbmc package.

[...]

> > - I'd like to investigate whether jenkins (with auto-generated jobs)
> >   could be used to take care of all the scheduling and triggering bits,
> >   plus notifications, etc. This would immediately come with the
> >   potential for scaling to build farms, as jenkins natively supports a
> >   master/slave setup.
> > - I am currently working towards acquiring hardware via the university.
> >   Once I've got some (more), I'll play with a jenkins setup. If this
> >   succeeds, I'll happily share it.
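For concreteness, auto-generating the jobs from the package list could be
quite mechanical. The sketch below is purely hypothetical: the job template,
job naming scheme, and the build command inside it are my own illustrative
assumptions, not part of any existing setup.

```python
# Hypothetical sketch: render one minimal Jenkins free-style job
# config.xml per Debian source package. Template, naming, and the
# build command are illustrative assumptions only.

JOB_TEMPLATE = """<project>
  <description>goto-cc build of {pkg}</description>
  <builders>
    <hudson.tasks.Shell>
      <command>apt-get source {pkg} &amp;&amp; cd {pkg}-*/ &amp;&amp; CC=goto-cc dpkg-buildpackage -us -uc -b</command>
    </hudson.tasks.Shell>
  </builders>
</project>"""

def job_config(pkg):
    """Render a minimal free-style job config.xml for one package."""
    return JOB_TEMPLATE.format(pkg=pkg)

# Creating the job would then be a single POST per package to Jenkins'
# remote access API, roughly (untested, server URL assumed):
#   requests.post(url + "/createItem",
#                 params={"name": "gotocc_" + pkg},
#                 data=job_config(pkg),
#                 headers={"Content-Type": "text/xml"})

if __name__ == "__main__":
    cfg = job_config("hello")
    import xml.etree.ElementTree as ET
    ET.fromstring(cfg)  # the rendered config must be well-formed XML
```

Whether CC=goto-cc is actually honoured varies across package build systems,
which is exactly the kind of per-package breakage that generated those bug
reports.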
> Would your student be focusing on jenkins only, or would she/he explore
> alternative solutions first (at least as "related work") and then focus on
> jenkins?

He might look into jenkins, but even that isn't fixed yet. Any other
suggestions are welcome, but given that time is very limited for such a
final-year project over here, I don't expect more than one or two solutions
to be experimented with for now (though I might try other options myself).

> Yesterday I mentioned in #-irc that I don't see jenkins as a generic
> solution for all the problems at hand, especially when focusing on more
> than just one tool; but that's just my impression, I haven't actually
> tried to do it.
> [...]

Well, at least jenkins alone won't solve the problem; I believe it will just
be a tiny piece of the puzzle. But I'm all ears to learn about other
conceivable solutions or experience that people might have!

Best,
Michael
_______________________________________________
Daca-general mailing list
[email protected]
http://lists.alioth.debian.org/cgi-bin/mailman/listinfo/daca-general
