On Sun, Sep 30, 2018 at 4:30 AM Paul Moore <p.f.mo...@gmail.com> wrote:
>
> On Sun, 30 Sep 2018 at 11:48, Nathaniel Smith <n...@pobox.com> wrote:
> >
> > Now that the basic wheels/pip/PyPI infrastructure is mostly
> > functional, there's been a lot of interest in improving higher-level
> > project workflow.
> [...]
> > This is very much a draft, intended as a seed for discussion, not a 
> > conclusion.
> [...]
>
> The problem with high-level management tools for workflow (and
> especially opinionated ones) is that unless you're very careful to
> survey people's requirements and specify your scope, you're always
> going to end up with people who need to do certain things *not* being
> served by your tool. So it's almost impossible to be "the one official
> tool".

Nathaniel, thanks for starting this discussion. I like how you're
stepping back and questioning old assumptions, etc.

I share Paul's concern a bit re: "one tool." As soon as a hypothetical
tool is released, it becomes saddled with backwards compatibility
guarantees, which prevents things from being fixed as you learn more.
This is related to Guido's(?) saying that putting something into the
standard library is like putting one of its feet in the grave (the
elephant's foot, in this case?).

Some questions related to your ideas: is the "elephant" one tool, or
more abstractly one set of specifications, or simply a recommended
workflow (e.g. for the 80%)? I think it would be good if a tool and /
or specifications are flexible enough so they can be adapted to use
cases that we might not have written down. Or is the tool explicitly
not trying to be useful in all use cases? Previously, I thought PyPA's
approach was to slowly create more standards (e.g. a standard for the
leg, the trunk, etc.), which would let others create and iterate on a
multitude of tools, as opposed to a top-down approach of designing one
tool first. Is that standards approach not working out, or is this
just something to start doing in parallel to supplement it?

To give you an idea, here's one example of a trickier workflow / use
case I've run into. Say you're locally developing an application that
you run inside a number of Docker containers (even during
development), and you want to be able to sync your code changes into
Docker in real time while the application is running. Also, you might
get most of your dependencies from PyPI, but occasionally you want to
swap in forks of dependencies that you can similarly edit while
developing (i.e. like editable installs). It can be challenging to get
something like this working if the tools you're using make too many
directory or workflow assumptions. However, a very powerful or
flexible tool (e.g. Git), or a collection of several tools that each
do one thing well, can often work in unanticipated situations.
(Neither of those options strikes me as beginner-friendly, though,
which might be the primary thing you're trying to solve -- I'm not
sure.)
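To make that concrete, here's a rough sketch of the kind of setup I
mean. All project names and paths here are invented for illustration;
the only real mechanisms are pip's editable installs and Docker's bind
mounts:

```shell
# Hypothetical layout: ./app holds the application code, and
# ./forks/somelib is a locally edited fork of a dependency that would
# normally come from PyPI. Inside the container you'd install the fork
# editably, e.g.:
#     pip install -e /srv/forks/somelib
# so host-side edits take effect without reinstalling.

# Write a minimal docker-compose.yml that bind-mounts both trees into
# the container, so code changes sync into Docker while the app runs:
cat > docker-compose.yml <<'EOF'
services:
  app:
    build: .
    volumes:
      - ./app:/srv/app                      # live-sync application code
      - ./forks/somelib:/srv/forks/somelib  # live-sync the forked dependency
EOF
```

A directory layout like this is exactly the sort of thing an
opinionated tool can break if it hard-codes where sources and
dependencies must live.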

--Chris
--
Distutils-SIG mailing list -- distutils-sig@python.org
To unsubscribe send an email to distutils-sig-le...@python.org
https://mail.python.org/mm3/mailman3/lists/distutils-sig.python.org/
Message archived at 
https://mail.python.org/mm3/archives/list/distutils-sig@python.org/message/I76YBVO3CIF2TKVV3RMJ4V6LTLEAG4XI/
