Jonathan Edwards wrote:

On Aug 25, 2009, at 10:00 PM, Brock Pytlik wrote:

no silly .. i mean the assumption that "dependencies are kept 'correctly' in package metadata" .. also assuming the notion that "correct dependencies" exist
Isn't this more or less arguing against the idea of packaging software at all?

nope .. it's a form of protection against the idea that package creators will *always* "do the correct thing"(tm) and that defined package dependencies are *always* absolutely correct regardless of what an administrator might know about the system he's trying to install .. nobody's saying that we should do away with dependencies .. just define a way to over-ride them if necessary.
As we've said, repeatedly, there are ways to deal with broken packaging.

Beyond tarball extraction, what does a packaging system provide except dependency management?

well let's see .. off the top of my head:
* bulk file tracking for install/uninstall
But if you don't trust the packager to get dependencies right, why would you trust them to group the files correctly? Why not just splat the exact set of files you need?
* fine tuning configuration files (ie: preinstall/postinstall)
Not touching this
* file modification tracking
Not sure what you mean by this?

* checksumming, discrepancy tracking
Checksumming is a good one.
* patching, repairing, etc
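To make the checksumming/discrepancy-tracking point concrete, here's a minimal Python sketch of the kind of check a tool like `pkg verify` or `rpm -V` performs. The function names and the in-memory manifest are made up for illustration; a real packaging system would persist this in its own metadata store:

```python
import hashlib
import os

def checksum(path):
    """SHA-256 of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def record_manifest(paths):
    """Snapshot checksums at install time (a toy stand-in for package metadata)."""
    return {p: checksum(p) for p in paths}

def verify(manifest):
    """Report files that were removed or modified since install."""
    discrepancies = []
    for path, digest in manifest.items():
        if not os.path.exists(path):
            discrepancies.append((path, "missing"))
        elif checksum(path) != digest:
            discrepancies.append((path, "modified"))
    return discrepancies
```

The same manifest is what makes "patching, repairing, etc" possible: once you know which files drifted, you know exactly what to re-deliver.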

We could provide a centralized location of tarballs with the same auxiliary features (search, contents, info, etc.), and that wouldn't really be a packaging system.

why not? ever seen the slackware installer? .. not necessarily as feature rich as some of the others out there, but it can do the job .. tag a tarball or cpio archive with some meta information, store the installation details in a database of some sort - and you could realistically do some dependency matching in an a posteriori fashion
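The "tag an archive, track what it installed" idea above can be sketched in a few lines of Python. This is purely illustrative, not how slackware's pkgtools actually work: the "database" here is just a JSON file mapping package name to delivered paths, which is enough to support clean uninstall and a posteriori inspection:

```python
import json
import os
import tarfile

def install(tarball, root, db_path):
    """Extract a tarball under root and record every path it delivered."""
    with tarfile.open(tarball) as tf:
        names = tf.getnames()
        tf.extractall(root)
    db = {}
    if os.path.exists(db_path):
        with open(db_path) as f:
            db = json.load(f)
    db[os.path.basename(tarball)] = names   # the "database": package -> file list
    with open(db_path, "w") as f:
        json.dump(db, f)

def uninstall(pkg, root, db_path):
    """Remove exactly the files the named package installed, then forget it."""
    with open(db_path) as f:
        db = json.load(f)
    for name in sorted(db.pop(pkg), reverse=True):  # deepest paths first
        path = os.path.join(root, name)
        if os.path.isfile(path):
            os.remove(path)
        elif os.path.isdir(path) and not os.listdir(path):
            os.rmdir(path)
    with open(db_path, "w") as f:
        json.dump(db, f)
```

Dependency metadata could be bolted onto the same record later, which is the a posteriori matching being described.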
I haven't seen the slackware installer. I guess we just have different desires for a packaging system because I don't know why that'd be a preferred solution. I kinda like installing a package and having it bring in the library it needs instead of having to discover each one that's needed in turn.

Files are included in the same package because there are dependency relationships between them that are needed to provide a set of functionality. I don't think you're suggesting that we should just provide users an interface of individual files to download and splat on their system. Why are the implicit dependencies (expressed by being members of the same package) any more or less valid than the explicit ones (expressed by the depend action)?

perhaps it might make more sense to separate associations from dependencies .. files are often included in the same package because they might be associated with a particular functional definition and form associations with the package definition .. i wouldn't necessarily say that "cp" is dependent on "tip" or that there is a dependency relationship between them, but i would say that both "cp" and "tip" are associated with what's been defined in the "core Solaris" package SUNWcs. To call them dependencies (whether required, optional, or incorporate) simply because they fall within the same package framework is a bit of a misnomer.

Personally, SUNWcs is not my example of what a package boundary should look like. Fine, call them "associations" instead of "dependencies", but if I remove the libraries delivered in SUNWfirefox, I'd bet firefox doesn't work well anymore. To me, that's a dependency.


pkgrecv --nodeps and re-publishing in a local repository is an idea and pretty simple to implement, but this doesn't necessarily help if you want to quickly remove something buried under 3 layers of dependencies and test your own dependencies .. agreed that this is typically done as part of a quick troubleshooting exercise (ie: remove this package, re-add this package) or developer exercise (eg: "let me make sure the system is forced to use my libraries instead of the ones provided by this dependency") instead of a proper *modification of installed packages invalidates your support agreement* type of exercise .. but i've been in this boat a number of times, and i really dislike having to work around the installation tools to get the system to look like i'd want it to ..
You'll have to clarify why this solution doesn't help in this situation. If what you want to do is test whether your libraries are being used, and you're unwilling to publish your own package, why not just splat the files in place? That's essentially what --force or --no-deps will do anyway. Why do you need the packaging system to do the "cp" commands for you?

in a word - so i can easily track them and back them out when i'm done .. setting up an entire repository to roll a whole distribution to work around a few broken dependency digraphs (note: broken as it pertains to the plans of the administrator) - seems a little like overkill in my opinion .. particularly if all i want to do is to temporarily clobber a known package a few layers deep (eg: zlib, openssl, etc)
What I, and I think others here, continue to be baffled by is why setting up a repository is such a painful thing to do. Why would you roll out a whole distribution? No one's ever proposed that. We've said, "republish the packages you don't like." Could we make the process simpler? Sure. Shawn's suggested several ways we might be willing to do that.
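For what it's worth, the "republish the packages you don't like" workflow is a handful of commands. This is only a sketch: the repository URI, FMRI, and publisher name are placeholders, and exact subcommands/options vary between pkg(5) releases, so check the man pages on your build:

```shell
# 1. Pull the package you want to modify out of the source repository
#    into a local file-based repository.
pkgrecv -s http://pkg.opensolaris.org/release -d /var/tmp/local-repo <pkg-fmri>

# 2. Edit its manifest (e.g. drop the depend actions you disagree with)
#    and republish it into the local repository with pkgsend.

# 3. Point the image at the local repository and install from there.
pkg set-publisher -O file:///var/tmp/local-repo opensolaris.org
pkg install <pkg-fmri>
```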

Btw, if all you want to do is put the files down, then back them out, zfs snapshots are wonderful things. It only provides one axis of change, but with some scripting love, I'm sure you could do exactly what you want without ever having to deal with the packaging system at all. I still don't get how that's different than --force, except it doesn't live in the packaging tool (because it breaks the notion of a well formed system).
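The snapshot/rollback dance might look like this. The dataset name and library path are just examples, so adjust to your own pool layout:

```shell
# Take a snapshot before clobbering anything.
zfs snapshot rpool/ROOT/mybe@before-test

# Splat the test library in place and run the experiment.
cp my/libz.so.1 /usr/lib/libz.so.1

# When done, roll the dataset back and clean up the snapshot.
zfs rollback rpool/ROOT/mybe@before-test
zfs destroy rpool/ROOT/mybe@before-test
```

Note that rollback reverts *everything* on the dataset since the snapshot, which is the "one axis of change" caveat above.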

Brock
[snip]



_______________________________________________
pkg-discuss mailing list
[email protected]
http://mail.opensolaris.org/mailman/listinfo/pkg-discuss
