On Sat, Feb 1, 2014 at 3:23 AM, Vinay Sajip <[email protected]> wrote:

> On Fri, 31/1/14, Brian Wickman <[email protected]> wrote:
>
> > There are myriad other practical reasons.  Here are some:
>
> Thanks for taking the time to respond with the details - they are good
> data points to think about!
>
> > Lastly, there are social reasons. It's just hard to convince most
> > engineers to use things like pkg_resources or pkgutil to manipulate
> > resources when for them the status quo is just using __file__.
> > Bizarrely the social challenges are just as hard as the
> > abovementioned technical challenges.
>
> I agree it's bizarre, but sadly it's not surprising. People get used to
> certain ways of doing things, and a certain kind of collective myopia
> develops when it comes to looking at different ways of doing things.
> Having worked with fairly diverse systems in my time, ISTM that
> sections of the Python community have this myopia too. For example,
> the Java hatred and PEP 8 zealotry that you see here and there.
>

PEP 302 tried to unify this with get_data() and set_data() on loaders, but
prior to Python 3.3 you just didn't have any guarantee that __loader__
would even be set, let alone point to a loader with those methods. Paul can
tell me if my hunch is off, but I assume the dream was that people would do
`__loader__.get_data('asset.txt')` instead of
`os.path.join(os.path.dirname(__file__), 'asset.txt')` to read something
bundled with their package. But as Brian has pointed out, people are by now
in the habit of assuming unpacked files and working off of __file__.
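
To make that concrete, the two styles look roughly like this (the package
and asset names are made up; pkgutil.get_data() is a convenience wrapper
that delegates to __loader__.get_data() when the loader provides it):

    import os
    import pkgutil

    # Status quo: assume the package is an unpacked directory on disk and
    # build a filesystem path off of __file__ (breaks inside a zip).
    asset_path = os.path.join(os.path.dirname(__file__), 'asset.txt')
    with open(asset_path, 'rb') as f:
        data = f.read()

    # Loader-aware: pkgutil.get_data() imports the package and calls its
    # __loader__.get_data(), so the same call also works for packages
    # served by zipimport.
    data = pkgutil.get_data('mypackage', 'asset.txt')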

I know Daniel has said he wanted some concept of a listdir() on loaders so
that they could basically act as a virtual file system for packages, but it
would also require locking down what relative vs. absolute paths mean in
get_data()/set_data() (i.e. is a path relative to the loader or to the
package itself? My preference is the latter, to cover the case of loaders
reused across packages) and really pushing people to use the loader APIs
for reading intra-package "files".


>
> One of the things that's puzzled me, for example, is why people think it's
> reasonable
> or even necessary to have copies of pip and setuptools in every virtual
> environment
> - often the same people who will tell you that your code isn't DRY enough!
> It's
> certainly not a technical requirement, yet one of the reasons why PEP 405
> venvs
> aren't that popular is that pip and setuptools aren't automatically put in
> there. It's a
> social issue - it's been decided that rather than exploring a technical
> approach to
> addressing any issue with installing into venvs, it's better to bundle pip
> and setuptools
> with Python 3.4, since that will seemingly be easier for people to swallow
> :-)
>

I suspect it's also ignorance and differences in deployment strategies.
Some people have deployments small enough that hitting PyPI every time
works; for others, like Brian, it can't be more than a cp with an unzip.
Maybe the Packaging User Guide could have a Recommended Deployment
Strategies section or something. I doubt there are many common needs
beyond "pull from PyPI every time", "pull from your own wheel repo", and
"copy everything over in a single wheel/zip you unpack", so people would
at least have a starting point to work from (especially if the
instructions work on all platforms, i.e. no symlink discussions).
_______________________________________________
Distutils-SIG maillist  -  [email protected]
https://mail.python.org/mailman/listinfo/distutils-sig
