Hi.  The advice here comes from someone who does this professionally,
for large, heavily loaded systems.  It may not apply to your case, or
not to the full extent.

> Debian (or even Python3 itself) doesn't allow to pip install required 
> packages system wide, so I have to use virtual environments even there. But 
> is it right, that I have to do that for every single user?

1. Yes, you can install packages system-wide with pip (on recent
Debian you have to override the "externally managed environment"
protection from PEP 668 to do it), but you don't need to.

2. pip is fine for installing requirements once, to figure out what
they are (in a development environment).  It's bad for a production
environment: it's slow, inconsistent, and insecure.  For context:
pip's dependency resolution is especially slow when installing local,
interdependent packages; sometimes it can take up to a minute per
package.  The inconsistency comes from pip not verifying package
checksums and signatures by default: if a package is re-uploaded
without a version bump, pip treats it as the same package.  On top of
that, for some packages pip has to fall back to building from source,
in which case nobody can guarantee the end result.  The insecurity
comes from Python allowing out-of-index downloads during install: you
can distribute your package through PyPI while one of its dependencies
points at a random Web site in a country with very permissive laws
(and, essentially, just puts malware on your computer).  It's
impossible to properly audit such setups, because the outside Web site
doesn't have to provide any security guarantees.
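If you do end up using pip against production anyway, the checksum
problem above can at least be mitigated with pip's hash-checking mode.
A sketch (the file here is a local stand-in so the commands run
anywhere; real projects hash their actual wheels, and the requests
pin shown in the comment is just an example):

```shell
# Sketch of pip's hash-checking mode.
printf 'demo wheel contents\n' > demo.whl   # stand-in for a real wheel
pip hash demo.whl                           # prints a --hash=sha256:... line

# A pinned requirements.txt carries one such digest per package, e.g.:
#   requests==2.31.0 --hash=sha256:<digest-from-pip-hash>
# and is installed with:
#   pip install --require-hashes -r requirements.txt
# pip then refuses anything unpinned, unhashed, or whose downloaded
# file doesn't match the recorded digest.
```

This doesn't fix the speed or the out-of-index problems, but it does
pin every install to a known artifact.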


To package anything Linux-related, use the packaging mechanism
provided by the flavor of Linux you are using.  In the case of Debian,
that's DEB.  Don't use virtual environments for this (it's possible to
roll an entire virtual environment into a DEB package, but that's a
bad idea).  The reason is that your package will then play nice with
the other Python packages available as DEB packages.  Your users get a
consistent interface for installing packages, and you avoid the
situation where an out-of-band tool has installed something into the
same path where dpkg later tries to install the same files from a
legitimate package.  If you package a whole virtual environment, you
may also run into problems locating native libraries linked from
compiled Python modules, and you will make the installation hard to
audit, especially when it comes to certificates, TLS and similar
things that, preferably, should be handled centrally by the OS.
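For a typical setuptools project, the Debian tooling (debhelper with
dh-python and the pybuild build system) needs surprisingly little.  A
minimal sketch of debian/rules; the package name "mypkg" is
hypothetical, and the rest of debian/ (control, changelog,
source/format) is assumed, not shown:

```makefile
#!/usr/bin/make -f
# debian/rules for a setuptools project ("mypkg" is a made-up name).
# debhelper delegates the build/install steps to pybuild, which knows
# how to drive setup.py / pyproject.toml builds.
%:
	dh $@ --with python3 --buildsystem=pybuild
```

With that in place, dpkg-buildpackage -us -uc builds an installable
python3-mypkg .deb that dpkg and apt can then track like any other
package.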

Of course, countless times I've seen developers do the exact opposite
of what I'm suggesting here.  The big actors in the industry, such as
Microsoft and Amazon, also do the exact opposite of what I suggest.  I
have no problem acknowledging this while still maintaining that they
are wrong and I'm right :) But you don't have to trust me!
-- 
https://mail.python.org/mailman/listinfo/python-list
