On Wed, 24 Feb 2021 at 13:12, Antoine Pitrou <anto...@python.org> wrote:
>
> On Wed, 24 Feb 2021 13:47:40 +0100
> Stéfane Fermigier <s...@fermigier.com> wrote:
> > The 3rd solution is probably the best of the 3, but the sharing mechanism
> > still needs to be specified (and, if needed, implemented) properly.
>
> I wouldn't want to repeat myself too often, but conda and conda-based
> distributions already have sharing through hardlinks (or, on Windows,
> whatever is available) baked-in, assuming you install your software
> from conda packages.
>
> That also applies to non-Python packages, and to python itself (which
> is just a package like any other).

I'm not sure conda solves the problem of *application* distribution,
though, so I think it's addressing a different problem. Specifically,
I don't think conda addresses the use case pipx is designed for.

Although to be fair, this conversation has drifted *way* off the
original topic. Going back to that, my view is that Python does not
have a good solution to the "write your application in Python, and
then distribute it" scenario. Shipping just the app, to be run on an
independently installed runtime, runs straight into the
conflicting-dependencies problem. Shipping the app with bundled
dependencies is clumsy, mostly because no-one has developed tools to
make it easier.
It also misses opportunities for sharing libraries (reduced
maintenance, less disk usage...). Shipping the app with a bundled
interpreter and libraries is safest, but hard to do and even more
expensive than the "bundled libraries" approach.

I'd love to see better tools for this, but the community preferred
approach seems to be "ship your app as a PyPI package with a console
entry point" and that's the approach pipx supports.

I don't use Linux much, and I'm definitely not familiar with Linux
distribution tools, but from what I can gather, Linux distributions
have made the following choices:

1. Write key operating system utilities in Python.
2. Share the Python interpreter and libraries.
3. Expose that Python interpreter as the *user's* default Python.

IMO, the mistake is (3). Users want to install Python packages, and not
all of those packages are provided by the distribution (or, if they
are, they aren't updated quickly enough), so users end up installing
packages with Python tools such as pip. That risks introducing
unexpected library versions and/or conflicts, which breaks the OS
utilities, because those utilities expect their declared requirements
to be respected (which is what the OS packaging tools guarantee).
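
To make that failure mode concrete (an illustrative sketch, not a real
transcript): running something like

    sudo pip install --upgrade requests

can replace or shadow the distro-packaged copy of requests in the
system site-packages, and an OS tool that was tested against the
distro's pinned version then picks up the newer copy and may break.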

Hindsight is way too easy here, but if distros had a "system Python"
package that OS tools depend on, and which is reserved for *only* OS
tools, and a "user Python" package that users could write their code
against, we'd probably have had far fewer issues (and much less FUD
about the "using sudo pip breaks your OS" advice). But it's likely way
too late to argue for such a sweeping change.

*Shrug* I'm not the person to ask here. My view is that I avoid using
Python on Linux, because it's *way* too hard. I find it so much easier
to work on Windows, where I can install Python easily for myself, and
I don't have to fight with system package managers, or
distribution-patched tools that don't work the way I expect. And
honestly, on Windows there's no "neglect of the system environment"
to worry about: if you install Python yourself and use pip to install
packages into that environment for shared use, it works fine. People
(including me) use virtual environments on Windows for *convenience*,
not because they're a requirement.
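
As a rough illustration (commands from memory, assuming a per-user
install of Python with the "py" launcher):

    py -m pip install requests              # into the shared environment, no admin needed
    py -m venv .venv                        # or a venv, purely for per-project convenience
    .venv\Scripts\python -m pip install requests

Neither touches anything the OS depends on, which is the point.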

Paul