On July 24, 2014 at 4:55:42 AM, Richard Jones (r1chardj0...@gmail.com) wrote:
Thanks for responding, even from your sick bed.

This message about users having to view and understand /simple/ indexes is 
repeated many times. I didn't have to do that in the case of PIL. The tool told 
me "use --allow-external PIL to allow" and then when that failed it told me 
"use --allow-unverified PIL to allow". There was no needing to understand why, 
nor any reading of /simple/ indexes.
Currently most users (I'm thinking of people who install PIL once or twice) 
don't need to edit configuration files, and with a modification we could make 
the above process interactive. Those ~3000 packages that have both internal and 
external files would be slow, yes.
They need to do it to understand whether a link is internal, external, or 
unverified. The feedback *I’ve* gotten is complete confusion about the 
difference between them. Even making that process interactive still means that 
pip cannot hard fail when it fails to retrieve a URL, and thus must present 
confusing error messages when a URL is temporarily down.
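
To make the distinction concrete, here is a rough sketch of the decision an 
installer has to make for every link on a /simple/ page, following the PEP 438-era 
conventions (rel="internal" for files hosted on PyPI, a hash fragment for 
verifiable external links). It is illustrative only, not pip’s actual code, and 
the example URLs and hash are made up:

    # Illustrative only -- not pip's actual link-classification code.
    # PEP 438-era conventions: rel="internal" marks files hosted on PyPI,
    # a hash fragment makes an external link verifiable, and everything
    # else (links scraped from homepages) is unverified.
    from urllib.parse import urlparse

    def classify_link(href, rel=None):
        if rel == "internal" or urlparse(href).netloc.endswith("pypi.python.org"):
            return "internal"
        if "#md5=" in href or "#sha256=" in href:
            return "external"
        return "unverified"

    print(classify_link("https://pypi.python.org/packages/source/e/example/example-1.0.tar.gz"))
    # -> internal
    print(classify_link("https://example.com/example-1.0.tar.gz#md5=0123456789abcdef0123456789abcdef"))
    # -> external (hash is a placeholder)
    print(classify_link("https://example.com/downloads/example-1.0.tar.gz"))
    # -> unverified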



This PEP proposes a potentially confusing break for both users and packagers. 
In particular, during the transition there will be packages which just 
disappear as far as users are concerned. In those cases users will indeed need 
to learn that there is a /simple/ page and they will need to view it in order 
to find the URL to add to their installation invocation in some manner. Even 
once install tools start supporting the new mechanism, users who lag behind 
(who, as we all know, are the vast majority) will run into this.
So we lengthen the transition time and gate it on an installer with the 
automatic hinting becoming the dominant version. We can pretty easily see 
exactly which versions of the tooling are being used to install things from PyPI.
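
For what it’s worth, that measurement is just a matter of tallying installer 
user agents from the download logs. A minimal sketch, assuming a user-agent 
string of roughly the form "pip/1.5.6 CPython/2.7.8 Linux/..."; the sample log 
lines below are hypothetical, not real PyPI data:

    # Minimal sketch: count pip versions seen in download-log user agents.
    # The "pip/<version> ..." prefix is an assumption about the UA format,
    # and the sample entries are hypothetical.
    import re
    from collections import Counter

    UA_RE = re.compile(r"^pip/(\d+(?:\.\d+)*)")

    def tally_pip_versions(user_agents):
        counts = Counter()
        for ua in user_agents:
            m = UA_RE.match(ua)
            counts[m.group(1) if m else "other"] += 1
        return counts

    print(tally_pip_versions([
        "pip/1.5.6 CPython/2.7.8 Linux/3.13.0-34-generic",
        "pip/1.4.1 CPython/2.6.9 Darwin/13.3.0",
        "Python-urllib/2.7",
    ]))
    # -> Counter({'1.5.6': 1, '1.4.1': 1, 'other': 1})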



On the devpi front: indeed it doesn't use the mirroring protocol because it is 
not a mirror. It is a caching proxy that uses the same protocols as the install 
tools to obtain, and then cache, the files for install. Those files are then 
presented in a single index for the user to use. There is no need for 
multi-index support, even in the case of having multiple staging indexes. There 
is a need for devpi to be able to behave just like an installer without needing 
intervention, which I believe will be possible in this proposal as it can 
automatically add external indexes as it needs to.
Yes, devpi should be able to update itself to add the external indexes.
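
The caching-proxy behaviour described above amounts to: fetch the upstream 
/simple/<project>/ page (and later the files it points to) on first request, 
cache it, and serve everything from the one local index. A toy sketch of that 
idea; it is not devpi’s actual implementation, and the upstream URL and cache 
directory are assumptions:

    # Toy sketch of the caching-proxy idea -- not devpi's implementation.
    # The upstream URL and cache directory are assumptions.
    import os
    from urllib.request import urlopen

    UPSTREAM = "https://pypi.python.org/simple"
    CACHE_DIR = "/tmp/simple-cache"

    def simple_page(project):
        """Return the /simple/ page for a project, fetching it once from
        upstream and serving the cached copy thereafter."""
        cached = os.path.join(CACHE_DIR, project + ".html")
        if not os.path.exists(cached):
            os.makedirs(CACHE_DIR, exist_ok=True)
            with urlopen("%s/%s/" % (UPSTREAM, project)) as resp:
                data = resp.read()
            with open(cached, "wb") as f:
                f.write(data)
        with open(cached, "rb") as f:
            return f.read()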



I talked to a number of people last night and I believe the package spoofing 
concept is also a vulnerability in the Linux multi-index model (where an 
external index provides an "updated release" of some core package like libssl 
on Linux, or perhaps requests in Python land). As I understand it, there is no 
protection against this. Happy to be told why I'm wrong, of course :)
It’s not really a “vulnerability”; it’s something that can be done regardless, 
and thus package authors are not part of the threat model. If I’m installing a 
package from a malicious author, I’m executing arbitrary Python from them. They 
can drop a .egg-info into site-packages and spoof a package that way. It is 
completely impossible to prevent the author of a package that someone else is 
installing from spoofing another package. The spoofing problem is a red 
herring; it’s like saying that your browser vendor could get your bank password 
because you’re typing it into the browser. Well yes, they could, but that trust 
is unavoidable. If you’re installing a package I wrote, you must extend trust 
to me.
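
To make the .egg-info point concrete: any setup.py you install runs with your 
privileges, so it can simply write metadata claiming to be some other project. 
A deliberately minimal illustration of that trust argument (the project name 
and version are hypothetical), not a recipe:

    # Deliberately minimal illustration of the point above. Any code run at
    # install time can write into site-packages, including metadata that
    # claims to be some other project. Name and version are hypothetical.
    import os
    import sysconfig

    site_packages = sysconfig.get_paths()["purelib"]
    spoofed = os.path.join(site_packages, "some_other_project-9.9.9.egg-info")
    os.makedirs(spoofed, exist_ok=True)
    with open(os.path.join(spoofed, "PKG-INFO"), "w") as f:
        f.write("Metadata-Version: 1.0\nName: some_other_project\nVersion: 9.9.9\n")
    # Tools that list installed distributions will now report
    # some_other_project 9.9.9 as installed, even though it never was.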




      Richard




-- 
Donald Stufft
PGP: 0x6E3CBCE93372DCFA // 7C6B 7C5D 5E2B 6356 A926 F04F 6E3C BCE9 3372 DCFA
_______________________________________________
Distutils-SIG maillist  -  Distutils-SIG@python.org
https://mail.python.org/mailman/listinfo/distutils-sig