Hi Stefan,

On Wed, 26 Apr 2017, Stefan Beller wrote:
> On Wed, Apr 26, 2017 at 1:19 PM, Johannes Schindelin
> <johannes.schinde...@gmx.de> wrote:
> > I recently registered the git-for-windows fork with Coverity to ensure
> > that even the Windows-specific patches get some static analysis love.
>
> YAY! How do you trigger the coverity scan?

For starters, I did everything manually. But I already have the grand plan
to use VSTS builds (I do not want to advertise this system too blatantly,
but I have to say that I really love working with it; it is so much more
powerful than Travis or AppVeyor, and Jenkins comes closer, but not by
much).

As you may have glimpsed in one of my mails, Git for Windows needs a very
special and extensive build setup. I call it the "Git for Windows SDK":
in a nutshell, it is a custom MSYS2 installation with all the packages
installed that are needed to build Git for Windows and its most important
dependencies, and to bundle everything in an installer. Think of MSYS2 as
something like Homebrew, but cozier, as it really lives inside its own
directory tree; its /etc/ is not really visible to any other software, for
example.

This makes Continuous Testing a bit of a challenge, as you cannot hammer
the package repository of MSYS2 with tons of fresh full installations all
the time (that package repository is run by a volunteer, and it is
sometimes unreachable, too). So after talking to a GitHub guy at the
Developer Summit during GitMerge (Patrick asked a couple of questions and
then said my plan was okay), I put up two new repositories that host the
initialized Git for Windows SDKs (one for 32-bit, one for 64-bit). And of
course there is a job, run once every 24 hours, to keep them up to date.

Almost all of the VSTS builds I maintain clone or update one or both of
these repositories. And so will the Coverity job.
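Such a clone-or-update step boils down to something like this (just a
sketch, not the actual job script; the helper name and paths are made up,
only the git-for-windows repository name is real):

```shell
# Hypothetical helper: get a checkout of the pre-initialized SDK,
# cloning on the first run and fast-forwarding on later runs.
clone_or_update () {
	url=$1 dir=$2
	if test ! -d "$dir/.git"
	then
		git clone -q "$url" "$dir"
	else
		git -C "$dir" fetch -q "$url" &&
		git -C "$dir" reset -q --hard FETCH_HEAD
	fi
}

# e.g. (64-bit SDK repository as published by the git-for-windows org):
# clone_or_update https://github.com/git-for-windows/git-sdk-64 git-sdk-64
```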
I still have to find the time to figure out one more detail: how to
download and extract the Coverity tool (the .zip archive has a variable
name for its top-level directory), and how to do that only every once in
a while, say, only when there is no previously unpacked tool, or when it
is already 4 weeks old.

And then I should probably figure out a way to delay subsequent runs of
that job so that we run it at most once every twenty-four hours.

Ciao,
Dscho
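P.S.: The download-and-cache part could look roughly like this. A sketch
under assumptions: $COVERITY_TOKEN, the project name and the directory
names are placeholders, and the download URL follows Coverity Scan's
POST-with-token scheme; the once-per-24h throttling is a separate
scheduling question not shown here.

```shell
# Re-download the Coverity tool only when no unpacked copy exists,
# or when the cached one is more than 4 weeks old.
tool_is_stale () {
	# stale := directory missing, or last modified over 28 days ago
	test ! -d "$1" || test -n "$(find "$1" -maxdepth 0 -mtime +28)"
}

download_tool () {
	dir=$1
	curl -o coverity_tool.zip \
		--data "token=$COVERITY_TOKEN&project=git" \
		https://scan.coverity.com/download/win64 &&
	rm -rf "$dir" tmp-unpack &&
	mkdir tmp-unpack &&
	unzip -q -d tmp-unpack coverity_tool.zip &&
	# the archive's top-level directory has a variable name,
	# so rename whatever single directory was extracted
	mv tmp-unpack/* "$dir" &&
	rmdir tmp-unpack
}

# intended use:
# tool_is_stale cov-analysis && download_tool cov-analysis
```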