On Tue, Feb 5, 2013 at 2:02 PM, Holger Krekel <[email protected]> wrote:

> Dropping the crawling over external pages needs _much_ more than just a few
> months' deprecation warnings, rather years. There are many packages out
> there, and it would break people's installations.
No, it won't. Nothing gets uninstalled. What stops working is installing those packages with pip/easy_install, and that will start working again as soon as the maintainer uploads the latest version to PyPI, which she/he is likely to do quite quickly once people start complaining.

> I certainly agree, though, that the current client-side crawling is a
> nuisance and makes for unreliability of installation procedures. I think we
> should move the crawling to the server side and cache packages.

That would mean a man-in-the-middle attack could poison PyPI's cache. I don't think that's a feasible path forward.

Packages do not need to be "cached", as they are not supposed to change. If you change a package, you should really release a new version (unless you made a mistake and discovered it before anyone actually downloaded it).

So what you are proposing is really that PyPI download the package from an untrusted source whenever the maintainer doesn't upload it. I prefer that we demand that the maintainer upload it.
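To make the trust problem concrete, here is a minimal Python 3 sketch of pinned-digest verification; it is my own illustration, not anything PyPI actually runs, and the URL and digest are hypothetical placeholders. The point it shows: a downloaded file can only be trusted if the expected digest arrives over a channel the attacker cannot touch, i.e. from the maintainer's own upload, not from the same crawl that fetched the file.

    # Illustration only: verify a crawled sdist against a digest pinned
    # out-of-band. SDIST_URL and PINNED_SHA256 are hypothetical placeholders.
    import hashlib
    import urllib.request

    SDIST_URL = "https://example.org/downloads/somepackage-1.0.tar.gz"
    PINNED_SHA256 = "0" * 64  # in reality: supplied by the maintainer's upload

    def fetch_and_verify(url, pinned_sha256):
        """Download a file and refuse it unless its SHA-256 matches the pin."""
        with urllib.request.urlopen(url) as resp:
            data = resp.read()
        actual = hashlib.sha256(data).hexdigest()
        if actual != pinned_sha256:
            # A man in the middle can tamper with the bytes in transit, but
            # cannot forge a matching digest. This check only means something
            # because the pin did NOT come from the same untrusted crawl.
            raise ValueError("digest mismatch, refusing to cache " + url)
        return data

    if __name__ == "__main__":
        blob = fetch_and_verify(SDIST_URL, PINNED_SHA256)
        print("verified %d bytes" % len(blob))

A server-side cache filled by crawling external hosts has no such pin to check against, which is exactly why the crawled copy cannot be trusted.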
//Lennart