The frustrating thing about this problem is that it seems to be fundamentally *solved* elsewhere. PLaneT didn’t do it quite right, it’s true, but the current package system doesn’t, either. In fact, in a number of ways, the new system is catastrophically worse than the old one: it doesn’t scale.
I’ve ranted about this plenty already, but most people seem unconvinced about why this is such a massive problem. I maintain that it is. With a little bit more knowledge on this topic, let me express precisely why the current system does not work.

In this conversation, two people touched upon points that I think are important. First, from Neil:

> My super-strongly preferred engineering notion: backward-compatibility of a
> package refers to the *documented* behavior of the package, not to actual
> behavior.

Second, from Jack:

> The biggest problem thus far with this workflow is that if a package `foo`
> says "I require version 2.0 of package X", the package catalog can only give
> you the most recent version of package X. This isn't a major issue if package
> X is well-behaved regarding backwards compatibility, but as these are social
> rules and not technical ones, outliers will exist.

These are both incredibly flawed. They might work fine in a tiny little academic environment, but in the real world, this is almost outlandish for a few reasons:

1. Things that are “just bugfixes” will inevitably break things every once in a while.

2. If a library has a bug, sometimes the only way to deal with it is to hack around the bug. When the bug is fixed, the hack might break. Broken code!

3. Asking people to make a new package for every breaking change they introduce (1) introduces a large barrier to package developers and (2) will lead to a massively cluttered set of packages with no semantic clarity about the differences. Packages foo12 and foo13 might have tiny incompatibilities, but foo13 and foo14 might be whole rewrites. This can be solved with tooling, fortunately. However, it does mean that, under the current model, every new version of a package will need to be a distinct “package” under the current system’s definition of “package”.

So what’s the right way to do it?

1. Use semantic versioning to version packages and to resolve dependencies.
   This works, but it still has the social problem, so…

2. Introduce a dependency “lockfile” the way Ruby’s Bundler tool does. This means that dependencies won’t change unless a user explicitly updates them, but updating to a new version is still painless because semantic versioning makes it clear.

3. Make this the only user-facing interface. Either scrap the current system or make it an internal component of a larger system.

I’ve actually taken some small stabs at implementing something like this, but it’s a nontrivial project, especially since the current package server’s API seems to be undocumented (I’m referring to the API the JS uses, not the interface raco uses). Plus I’d be on my own, and so far people don’t seem to care much about fixing this. Perhaps I’m wrong and it’s not as big of a problem as I think.

Alexis

(As an aside, the ability to introduce breaking changes into an API without fear of breaking everyone’s code is incredibly powerful for a package maintainer. It’s mostly why the iteration speed in JS-land can be so blindingly fast, but everyone still hangs together. There are problems with that example, specifically, but working on Racket packages feels like walking on eggshells in comparison.)

-- 
You received this message because you are subscribed to the Google Groups "Racket Developers" group.
To view this discussion on the web visit https://groups.google.com/d/msgid/racket-dev/60656584-93BE-412D-B46E-64A26943B78E%40gmail.com.
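To make point 1 above concrete, here is a minimal sketch (in Python, purely for illustration) of what semver-aware resolution buys you over “always give the latest version.” Every name here is hypothetical, not part of any Racket tool; the caret constraint follows the common semver convention of “compatible within the same major version”:

```python
# Hypothetical sketch of semantic-version constraint matching.
# A "^2.0.0" constraint means: same major version as 2.0.0, and at
# least that minor/patch. None of these names come from raco or the
# package catalog; they only illustrate the resolution rule.

def parse_version(v):
    """Parse 'MAJOR.MINOR.PATCH' into a tuple of ints for comparison."""
    return tuple(int(part) for part in v.split("."))

def satisfies(version, constraint):
    """Check a version against a caret constraint like '^2.0.0'."""
    want = parse_version(constraint.lstrip("^"))
    have = parse_version(version)
    return have[0] == want[0] and have >= want

def resolve(available, constraint):
    """Pick the newest available version that satisfies the constraint,
    or None if nothing matches (instead of silently handing out the
    latest release, breaking changes and all)."""
    matching = [v for v in available if satisfies(v, constraint)]
    return max(matching, key=parse_version) if matching else None

releases = ["1.9.0", "2.0.0", "2.3.1", "3.0.0"]
print(resolve(releases, "^2.0.0"))  # picks 2.3.1, not 3.0.0
```

The point of the `None` branch is exactly Jack’s complaint: a catalog that can only serve the newest version has no way to say “nothing compatible exists,” whereas a semver-aware resolver can.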
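Point 2 can be sketched just as briefly. This is the Bundler-style behavior described above, not any existing Racket mechanism; the file name and JSON format are assumptions made up for the example:

```python
# Hypothetical sketch of a Bundler-style lockfile: the first install
# records exactly which versions were resolved, and later installs
# reuse those pins until the user explicitly updates. The lockfile
# name and format here are invented for illustration.

import json
from pathlib import Path

LOCKFILE = Path("deps.lock.json")

def install(resolve_fresh):
    """Return {package: version}. If a lockfile exists, honor its pins;
    otherwise resolve from scratch and write the pins down."""
    if LOCKFILE.exists():
        return json.loads(LOCKFILE.read_text())  # pinned: no surprises
    pins = resolve_fresh()                        # first install: resolve
    LOCKFILE.write_text(json.dumps(pins, indent=2))
    return pins

def update():
    """Explicit user action: discard the pins so the next install
    re-resolves against the catalog."""
    LOCKFILE.unlink(missing_ok=True)
```

The property this buys is the one argued for above: upstream releases, well-behaved or not, cannot change what your build sees until you run the equivalent of `update()` yourself.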