Hi all,

after the remaining problems (non-integer index, unavailable "dask" package, non-x86 test failures) were fixed or worked around, a new test problem with numpy 1.12 appeared that made the build very unreliable (random >50% failure rate). I am, however, convinced that this is a problem in the test framework, either in numpy (which introduced the regression with 1.12) or in skimage (parallel execution of tests that are not thread-safe); the details are still unclear and are being discussed in

https://github.com/numpy/numpy/issues/8413

Since this is a problem with the tests and not with the package itself, I have temporarily disabled the tests completely, so that the package should build everywhere and hopefully migrate before the freeze.

I still think that the cause of the trouble here is that numpy was uploaded as a beta version, with no warning beforehand and with no information at all about what was changed and where problems could be expected. Just throwing all betas, RCs and whatnot into unstable is IMO a very bad practice that is likely to cause trouble, and I would like to ask whether we could have something in our policy describing how important packages (numpy, scipy, ...) should be updated. IMO, prereleases should go to experimental and be accompanied by an announcement on debian-science, listing API incompatibilities and other possible problems and asking people to test and comment within a reasonable time frame. Only after that should it be decided how to proceed with the upgrade.

Having a problem like the one above just a few days before a freeze is really something one doesn't want to happen.

Best regards, and oh yes: Merry Christmas to all!

Ole

