On Sun, 1 Dec 2019 01:33:50 -0800 (PST) John Ladasky <john_lada...@sbcglobal.net> wrote:
> The only thing I must install with pip is tensorflow-gpu. For
> everything else, I make use of the Ubuntu repositories. The Synaptic
> package manager installs packages (including Python modules) for all
> user accounts at the same time, which I like.
>
> When I installed tensorflow-gpu using pip, I was in fact frustrated
> because I couldn't figure out how to deploy it across multiple user
> accounts at one time. I ended up installing it three times, once in
> each account. You're suggesting that's actually preferred, at least
> when pip is performing the installation. OK, I will endure the
> repetition.

You can set up a system-wide virtualenv (for instance in
/usr/local/lib/myenv) and use pip install as root to set up everything
into that. All the normal users have to do then is prepend
/usr/local/lib/myenv/bin to their PATH. After that, you have a
system-wide consistent distribution of all your needed Python packages.
You can then uninstall all Python packages provided by the Linux distro
which you don't need.

At the moment it seems as if all you need to install locally with pip
is tensorflow-gpu. This will change once some future version of
tensorflow-gpu depends on newer versions of the system-provided
packages. When that happens, pip will pull all those packages into the
user's local venv, and it will have to do that individually for each
user.

BTW, it took me a long time to embrace Python's "virtualenv" concept
because I had a hard time figuring out what it was and how it worked.
Turns out that there is no magic involved, and that "virtual
environment" is a misnomer. It is simply a full Python environment in a
separate location on your system. Nothing virtual about it.

-- 
https://mail.python.org/mailman/listinfo/python-list
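The setup described above can be sketched with standard venv/pip commands. The path /usr/local/lib/myenv comes from the post and needs root; a user-writable path stands in here so the sketch runs as-is, and the package to install is left as a comment since it needs root and a network.

```shell
# Sketch of the system-wide venv setup, assuming python3 with the venv
# module is installed (on Ubuntu: the python3-venv package).
VENV=/tmp/myenv                   # in practice: /usr/local/lib/myenv (as root)
python3 -m venv "$VENV"           # creates a plain directory tree -- no magic
export PATH="$VENV/bin:$PATH"     # what each normal user adds to their PATH
command -v python                 # prints /tmp/myenv/bin/python
cat "$VENV/pyvenv.cfg"            # just a small config pointing at the base install

# As root against /usr/local/lib/myenv, one pip install serves all users:
# "$VENV/bin/pip" install tensorflow-gpu
```

Once the venv's bin directory is first on PATH, python and pip resolve into the shared environment for every account, which is the whole trick.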