Hello all,

I don't know if this suggestion misses some point, or whether it overlaps 
with something already proposed before.

In a professional environment, we've come to a point where most people use 
virtual environments, or similar isolated environments, to avoid "polluting 
the global environment".

However, I think that points to a problem with the default behaviour of 
module management in Python. A nicer default behaviour would be to search 
for a requirements.txt file in the same directory as __file__, and use the 
newest version of every module that matches the constraints listed there. If 
no requirements were given, the newest version already installed could be 
used. That would require an installation layout that allows multiple 
versions of the same module to be installed side by side.
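To make the idea concrete, here is a rough sketch of what the lookup could 
look like. It uses the third-party `packaging` library for parsing 
requirement lines; `installed_versions()` is entirely hypothetical, since 
nothing in today's stdlib can list side-by-side versions of a module:

    from pathlib import Path

    from packaging.requirements import Requirement
    from packaging.version import Version


    def installed_versions(name):
        """Hypothetical: list all versions of `name` installed
        side by side. No such facility exists today."""
        raise NotImplementedError


    def resolve(script_path):
        """Build a picker that chooses, per module, the newest
        available version satisfying the script's requirements.txt."""
        req_file = Path(script_path).parent / "requirements.txt"
        constraints = {}
        if req_file.exists():
            for line in req_file.read_text().splitlines():
                line = line.strip()
                if line and not line.startswith("#"):
                    req = Requirement(line)
                    constraints[req.name] = req.specifier

        def pick(name):
            candidates = [Version(v) for v in installed_versions(name)]
            spec = constraints.get(name)
            if spec is not None:
                candidates = [v for v in candidates if v in spec]
            # Newest match wins; with no requirements.txt, this
            # falls back to the newest version already installed.
            return max(candidates, default=None)

        return pick

The import machinery would then load whatever pick() returns for each 
module, instead of the single globally installed copy.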

I already anticipate some problems: increased disk usage for people who are 
not using virtual environments; the possibility of breaking changes for 
scripts that place no constraints on a module (if something else downloads a 
newer version, an unconstrained script would silently start using it); and 
of course the hassle of a completely new default behaviour that would 
require a transition in many codebases. But I still believe it would pay off 
in time saved installing and switching environments.

Also, I think it's a good step on the path toward integrating pip more 
closely into the Python core.

What's your opinion: is the effort required too big for the returns? Do you 
think other problems might arise?