On Thursday 06 March 2003 22:00, Paul Dorman wrote:
> Or what about some kind of p2p solution? Where xxxx-light machines are
> networked to and updated from other xxxx-light machines across the net?
> Checksumming and other tools could be used to address security concerns.
You know, I almost took a job working for a company that thought the time for this had come a year and a half ago.... Maybe it is more doable now, at least for open source software (you don't have to worry about how to bill people, how to force users to stay online whenever possible, etc.), but it would still be a major project, and there are problems that nobody has solved yet.

On the one hand, an open source project can just use an existing protocol (say, gnutella) rather than building something new from scratch, and doesn't need to worry about billing and the like. And just distributing SHA URIs on official mirrors would be enough to let you search for the file online and verify that you've downloaded the right one (and of course RPM signatures provide security on top of that).

But on the other hand, where does the network come from? If you build a new p2p network from scratch, you need to get people online, and most users won't be connected to the network except when they're in the middle of their own upgrade. If you use, say, the existing gnutella network, you have the advantage that every Mandrake user who's running gtkg, qtella, limewire, etc. (assuming they've added their package repository to their p2p upload directory list) is available--but the disadvantage that most of the people on the network don't have the files you want. Either way, you'll probably still need mirror sites--and I'm guessing it's much easier to find someone who will run ftp, rsync, and/or http mirrors than someone who will attach their mirror server to either a brand-new p2p network or the existing gnutella network....

> Oh, and I think that packages should be revertable on installed systems
> as well. Users should be protected against unstable software wherever
> possible, but at the same time they will demand the very latest releases.
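Incidentally, the verification story above is simple enough to sketch. Here's roughly what a client would do with a published SHA-1 digest (the file name is hypothetical, and a real client would also run `rpm --checksig` on top of this to check the GPG signature):

```python
# Sketch: verify a downloaded package against a digest published on an
# official mirror. Only the digest needs to come from a trusted source;
# the bytes themselves can come from any untrusted peer.
import hashlib

def verify_sha1(path, expected_hex):
    """Return True if the file's SHA-1 digest matches the published one."""
    h = hashlib.sha1()
    with open(path, "rb") as f:
        # Hash in chunks so large RPMs don't have to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest() == expected_hex
```

The nice property for p2p is that the mirror only has to serve a few bytes per package (the digest, or a urn:sha1 URI built from it); the bulk transfer can happen anywhere.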
It would be nice to be able to downgrade through urpmi and the GUI tools. (Of course you can already downgrade today--just download the older packages and force-upgrade--but it's not as easy as installing or upgrading.) If I tried to downgrade kdebase, it would tell me "you also need to downgrade kdelibs and kdegames and uninstall kdevelop," and (if I approved) it would go get the relevant versions of kdebase, kdelibs, kdegames, and so on.

I think being able to deal with the same package groups as the installer when upgrading, installing, or downgrading would also be helpful. A beginning user knows that he installed "KDE Workstation" and wants to upgrade it, or that he skipped "LAN Filesharing" (or whatever that option is called) but now wants it--but he probably doesn't know which packages that involves.

Maybe something like Microsoft's "restore points" in XP, but done right, would be useful as well: I mark a system restore point, then upgrade to the new version of Mandrake, install a bunch of new packages through rpmdrake, whatever; then, if it doesn't work, I just restore to the last point. Unfortunately, I think it would be even harder to get this right under linux than under XP.

Anyway, I think all of these ideas deserve looking into. Of course, these kinds of suggestions always come up at the worst possible time, because that's when people think about them. Certainly you don't want anyone at Mandrake, or anyone who could be contributing to the 9.1 effort, putting much time into anything like this for the next few days. So remember the ideas that are most important to you, wait until 9.1 is out the door and everyone's had a little breathing time, then start a discussion while there are still months to go before the next freeze.
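For what it's worth, the core of the downgrade calculation isn't exotic. A toy sketch (the package graph here is made up, and real resolution would also have to track version constraints, not just names): compute the reverse-dependency closure of the target, i.e. everything that transitively requires it and so must be downgraded or removed along with it.

```python
# Hypothetical miniature of a downgrade resolver's first step.
# `requires` maps each installed package to the packages it depends on.
requires = {
    "kdelibs": [],
    "kdebase": ["kdelibs"],
    "kdegames": ["kdelibs"],
    "kdevelop": ["kdebase", "kdelibs"],
}

def downgrade_closure(target, requires):
    """Return every package that transitively requires `target`."""
    # Invert the graph: for each package, who depends on it?
    dependents = {}
    for pkg, deps in requires.items():
        for dep in deps:
            dependents.setdefault(dep, []).append(pkg)
    # Walk the reverse edges from the target outward.
    seen, stack = set(), [target]
    while stack:
        pkg = stack.pop()
        for rdep in dependents.get(pkg, []):
            if rdep not in seen:
                seen.add(rdep)
                stack.append(rdep)
    return seen
```

With this graph, asking to downgrade kdelibs would flag kdebase, kdegames, and kdevelop--which is exactly the kind of "you also need to..." prompt described above; the tool would then fetch matching older versions of each, or offer to uninstall the ones that have none.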