On 02/15/2011 11:32 AM, James Rhodes wrote:
You see, I would think that the distribution owners would actually
want to hand that work off to the person building or developing the
software in the first place, since the developer knows when they
release updates, they know what dependencies their software has and
hence they can maintain their packages in a much more timely and
secure manner than having a third-party distributor build and ship
them.
You think wrong. We would even like hardware vendors to leave the development of drivers to the community (and just provide the specs).
On Tue, Feb 15, 2011 at 9:11 PM, FlorianFesti<ffe...@redhat.com>  wrote:
In such a packaged world the amount of data needed for updating a
(compromised) library is enormous. This basically shuts down updates for
everything but the most urgent exploits, and even those generate an ugly
amount of fallout - especially as these updates come in one big chunk (think
about an exploit in zlib).
I understand your point here to be that the original software
developer may not have the bandwidth available to ship updates to all
the users of their library / application, right?  In that case, the
distributor could mirror the AppTools package on their AppServer,
and the updating system would use that (it obviously prefers using the
distributor's service over an individual's server; and since it's a
mirror, there's no need for each package to be managed by the
distributor, it can be an automatic mirroring).
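The mirror-preference scheme described above could be sketched roughly like this (the AppTools/AppServer names come from this thread; the URLs, dict layout, and function name are illustrative assumptions, not any real updater's API):

```python
# Sketch: prefer the distributor's automatic mirror over the
# upstream developer's individual server, falling back if needed.
def pick_update_source(mirrors, upstream):
    """Return the preferred download source for a package update."""
    for mirror in mirrors:
        if mirror.get("available"):
            return mirror["url"]          # distributor mirror wins
    return upstream                       # fall back to the developer's server

mirrors = [{"url": "https://mirror.distro.example/apptools/", "available": True}]
print(pick_update_source(mirrors, "https://upstream.example/apptools/"))
```

The point of the design is that the distributor only runs a dumb mirror; no per-package maintenance happens on their side.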
This issue goes beyond mirror bandwidth. Think about having 100 application packages, each containing 20 libs and weighing 200MB. Now consider that every lib has a critical bug every two years on average.
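Plugging those numbers in shows the scale of the problem (assuming, as the bundling model implies, that every library fix means re-shipping the whole 200MB package):

```python
packages = 100
libs_per_package = 20
package_size_mb = 200
bugs_per_lib_per_year = 0.5   # one critical bug every two years

# Each bundled-lib bug forces a full re-download of its package.
updates_per_package_per_year = libs_per_package * bugs_per_lib_per_year  # 10
traffic_per_package_mb = updates_per_package_per_year * package_size_mb  # 2000
total_traffic_gb = packages * traffic_per_package_mb / 1000

print(total_traffic_gb)   # 200.0 GB of security updates per machine per year
```

With shared system libraries, the same fixes would instead be a handful of small library updates.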

The distributions are a trusted third party that makes sure that the
software they get from upstream is not malicious. Sure, vendors with a strong
brand don't need a third party (e.g. the Adobe repositories). But the target
audience of such package formats typically doesn't have such a brand.
In order to ensure that updates you receive come from the source that
originally produced the initial package you installed, you use
signing.  It's really that simple.
This is not what I am talking about. Read the paragraph above once again. With a signature you can verify the source, but you don't know whether you can trust the source or not. This is not an issue for well-known sources, but there will be lots of others.
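The authenticity-versus-trust distinction can be illustrated with a toy verifier (HMAC stands in here for real asymmetric package signatures, and all keys, signer names, and package contents are made up):

```python
import hmac
import hashlib

# Keys we have explicitly decided to trust (a distro's keyring, conceptually).
trusted_keys = {"distro": b"distro-signing-secret"}
# A perfectly valid key belonging to an unknown third-party vendor.
unknown_key = b"random-vendor-secret"

package = b"app-1.0.tar.gz contents"

def sign(key, data):
    return hmac.new(key, data, hashlib.sha256).hexdigest()

def verified_and_trusted(signer, signature, data):
    key = trusted_keys.get(signer)
    if key is None:
        # The signature may be cryptographically valid, but we have no
        # basis for trusting this signer - verification alone is not trust.
        return False
    return hmac.compare_digest(sign(key, data), signature)

good = verified_and_trusted("distro", sign(trusted_keys["distro"], package), package)
bad = verified_and_trusted("some-vendor", sign(unknown_key, package), package)
print(good, bad)   # True False
```

The second package is signed just as validly as the first; what it lacks is a signer anyone has a reason to trust, which is exactly the role the distribution plays.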
The know-how of good packaging and package maintenance does not scale down
very well. There is a serious amount of general knowledge and continuous
work needed. This is significantly easier within a big project dedicated to
this task than on your own. No matter how good your tools are, they still
put a pretty big burden onto the third-party vendors (have a look at
the rpms they build).
I really don't understand what you're trying to say here.
I am saying that most upstream vendors, and especially commercial software vendors, do a pretty bad job with packaging right now, and I don't see why one should expect them to become better.

I agree that the SUSE build system has come a long way toward making the
process easier, but in my opinion it's still a workaround for a broken
system (and there's no such build option for third-party, closed-source
software at this point).
Well, the current system of the distributions has been shown to work well in a given environment. Very few other packaging approaches have been shown to work at all (in the sense of gaining any notable market share) - none of them with a general scope. While I don't think it is impossible, all approaches so far have failed to take all the important requirements into account.

Florian
_______________________________________________
Distributions mailing list
Distributions@lists.freedesktop.org
http://lists.freedesktop.org/mailman/listinfo/distributions
