Lucas Nussbaum [2020-03-26 14:58 +0100]:
> - package is uploaded
> - package gets accepted in unstable
> - package gets reviewed, a bug is filed
> - bug gets fixed
>
> Except that with (B), we avoid the wait in NEW.
>
> One important question is: how often does the FTP team run into a
> package that is so problematic that accepting it in Debian with an RC
> bug is not an option?
At least during my many years of Ubuntu archive administration I have seen
quite a lot of packages which contained non-distributable files, had
hilariously broken maintainer scripts (which could then also damage *other*
software on your system), and the like. For these an initial NEW review was
quite important.

That proposal assumes that the "package gets reviewed, a bug is filed" step
actually happens in a timely manner, but that is precisely the problem --
with such a workflow we would essentially stop having NEW review and just
hope that someone catches bad packages before they get released. So IMHO
this is not a solution; it only lets buggy packages creep into unstable.

However, as always in life, this appears to be an 80/20 problem. A lot of
new packages are small, simple, and harmless, and can be NEW-reviewed in
minutes -- e. g. a new Python or Perl module where the whole source code has
a single license, very few authors, and no "funny" files. But these 80% tend
to get stuck behind the large and complicated new packages.

@ftpmasters, would it help to try some automation on the 80% case, and
e. g. auto-process packages if they are lintian-clean, suspicious-source
reports nothing, and there is a reasonable overlap between the names found
by licensecheck and grep -i '(c)' and those appearing in debian/copyright?
That way ftpmasters could concentrate on the 20% of complicated packages.
Or are the 80% already not a problem/time sink?

Thanks,
Martin
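To make the last point concrete, here is a rough sketch of just the
copyright-overlap part of such a check. The function name and the exact
heuristics are made up for illustration -- this is not an existing dak or
ftpmaster script. It scans an unpacked source tree for copyright notices
and flags holders that do not also appear in debian/copyright; a real
implementation would additionally gate on lintian and suspicious-source.

```shell
# Hypothetical pre-filter sketch (assumed names, not an existing tool):
# flag copyright holders found in the upstream source that are missing
# from debian/copyright. Uses GNU grep/sed extensions.
check_copyright_overlap() {
    srcdir="$1"   # unpacked source tree containing debian/copyright
    # Find lines like "Copyright (C) 2019 Jane Doe" outside debian/,
    # strip the "Copyright (C) <years>" prefix, deduplicate, and check
    # each remaining holder name against debian/copyright.
    grep -rhoiE 'copyright (\(c\) )?[0-9]{4}(-[0-9]{4})?,? [A-Za-z][A-Za-z. -]*' \
         "$srcdir" --exclude-dir=debian 2>/dev/null \
      | sed -E -e 's/^copyright (\(c\) )?[0-9]{4}(-[0-9]{4})?,? *//I' \
               -e 's/ +$//' \
      | sort -u \
      | while read -r holder; do
            grep -qiF "$holder" "$srcdir/debian/copyright" \
              || printf 'not in debian/copyright: %s\n' "$holder"
        done
}
```

An empty result would mean "plausibly complete debian/copyright" and the
package could go into the fast queue; any output would route it to the
usual human review instead.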