>> Sean 'Shaleh' Perry <[EMAIL PROTECTED]> writes:
> or do a staging in experimental or somewhere else. Upload everything
> there, let people look at it for a day or two then move it over.
That's the way I interpreted this, too. It's insane to try to NMU 1000
packages in one day.

My one problem with this "solution" is that it will break people's
systems in a very visible way. For example, at the group where I work,
90% of our development is done in C++. Once this plan is carried out, an
upgrade would break almost all our binaries in one fell swoop. Keeping
the old libraries in a special directory would alleviate the problem
for us.

After some grep-dctrl hackery, it looks like we have something like 250
*library* packages which are affected (this number is probably biased
upwards). That would represent a 2% increase in the number of packages
(a 1 GB increase in the archive size? 400 kB average size for a library
package? Sounds OK; we have some pretty large library packages, but it's
probably less). Rough sketches of both the query and the arithmetic are
appended at the end of this message.

But the non-transparency of that solution makes it unattractive. Hacking
the dynamic linker isn't sexy, but it sounds doable. Besides the ugliness
factor, is there anything that speaks against it? Is there an alternative
that *doesn't* involve local recompiles?
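For the curious, here is roughly the query, redone as a small Python
sketch over a Packages file (the real run used grep-dctrl; the file
path, the Section filter, and the "libstdc++" match string are my
assumptions here, so treat the count as approximate):

    # Count binary packages in the libs section whose Depends mention
    # the old C++ runtime.  Substring-matching "libstdc++" is a crude
    # proxy for "affected", hence the upward bias mentioned above.

    def stanzas(path):
        """Yield one {field: value} dict per paragraph of a Packages file."""
        fields = {}
        with open(path) as f:
            for line in f:
                if not line.strip():          # blank line ends a paragraph
                    if fields:
                        yield fields
                        fields = {}
                elif line[0] not in " \t":    # skip continuation lines
                    name, _, value = line.partition(":")
                    fields[name] = value.strip()
        if fields:
            yield fields

    affected = [p["Package"] for p in stanzas("Packages")
                if p.get("Section") == "libs"
                and "libstdc++" in p.get("Depends", "")]
    print(len(affected), "library packages affected")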
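And the back-of-the-envelope arithmetic behind the 2% and 1 GB figures,
as I reconstruct it (the ~12,500-package total is implied by the 2%;
the multiplier of roughly 11 release architectures is an assumption on
my part):

    # 250 packages and 400 kB come from the estimate above; the ~11
    # architectures multiplier is an assumption, not a measured figure.
    extra_pkgs = 250                        # affected library packages
    total_pkgs = extra_pkgs / 0.02          # 2% increase implies ~12,500 total
    per_arch_mb = extra_pkgs * 400 / 1000   # ~100 MB of old libraries per arch
    print(total_pkgs, per_arch_mb * 11 / 1000)   # 12500.0, ~1.1 (GB)

As said above, the real number is probably less, since the 400 kB
average is on the generous side.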
-- 
Marcelo            | The duke had a mind that ticked like a clock and, like
[EMAIL PROTECTED]  | a clock, it regularly went cuckoo.
                   |        -- (Terry Pratchett, Wyrd Sisters)