On 1/08/2013 4:35 AM, memilanuk wrote:
Also... in some places in the 'Net I see references to installing
everything 'locally' via pip, etc. in virtualenvs and not touching the
system installed version of python... yet most linux distros seem to
have many/most such packages available in their package repos, which
seems like it'd be easier to install via the package manager and let it
keep things updated.  Could someone touch on what they feel the pros and
cons would be either way?

Generally, if your OS installs a version of Python by default you should leave it alone because the OS itself is dependent on it. Updating to newer versions of Python or installed libraries can introduce version conflict errors in system-level apps, which is a bad thing.

Similarly, using the system installation and its libraries ties you to those versions. This may not be an issue if you're just scripting a few helper tools for your system, but it's an unnecessary hindrance if you're developing independent applications.

Tools like virtualenv or zc.buildout provide a handy way of sandboxing the dependencies of individual applications. They let you build more than one app in parallel and not let the dependencies of one interfere with the others. Of equal importance is their use in deploying to other machines. With virtualenv, you can create a list of installed libraries with:

    pip freeze > requirements.txt

To ensure a target machine has all of the dependencies your application needs you can then do:

    pip install -r requirements.txt

So: for simple scripts, just go with the system install. For serious development work, I highly recommend using virtualenv or zc.buildout to contain each development environment.
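To make that concrete, a typical workflow might look like the sketch below. The project name and paths are just examples; I'm using the stdlib venv module here (Python 3.3+), but the commands with the virtualenv tool are analogous:

```shell
# Hypothetical project; names and paths are examples only.
# Create an isolated environment (use "virtualenv myproject-env"
# if you're on an older Python without the venv module):
python3 -m venv myproject-env

# Activate it so that pip and python refer to the sandbox,
# not the system installation:
. myproject-env/bin/activate

# Install your dependencies inside the sandbox, then record
# the exact versions for later redeployment:
pip freeze > requirements.txt
```

On the target machine you'd create a fresh environment the same way, then run "pip install -r requirements.txt" to reproduce the dependency set.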
--
http://mail.python.org/mailman/listinfo/python-list
