At the risk of having Theo tell me to shut up and get back to work on
things that matter, ...

On Fri, Jan 2, 2015 at 1:33 AM, FRIGN <[email protected]> wrote:
> On Fri, 12 Dec 2014 20:02:33 +0100
> Ingo Schwarze <[email protected]> wrote:
>
>> There are dragons.
>
>> If this scares anybody, i'm not surprised; updating libraries is
>> not a playground for newbies.
>
>> That, actually, is *terrible* advice and almost guarantees a fiasco.
>> If you edit shlib_version manually, you build a library containing
>> code *incompatible* with what it's supposed to contain, so the end
>> result will be that programs using that library will, at run time,
>>  * crash,
>>  * produce obviously wrong results,
>>  * and/or silently produce results that are wrong in non-obvious ways.
>> Some programs, by mere luck, may also work, but it's hard to
>> predict in advance which ones will and which ones won't.
>
>> To update a library, update all related source code - ideally,
>> the whole source tree unless you know precisely what you are
>> doing - to one consistent state, then compile from that state
>> *without* manually screwing with shlib_version.  That's the whole
>> point of library versioning!
>
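
For anyone following along: the versioning Ingo is talking about is
nothing exotic. Each library in the OpenBSD tree carries a two-line
shlib_version file; a sketch of the convention (numbers and the
libfoo name invented for illustration):

    major=2
    minor=0

Roughly: bump minor when you add compatible interfaces, bump major on
any ABI break. The build turns that into libfoo.so.2.0, and ld.so
will only pair a binary with a library whose major matches what it
was linked against. Edit the file by hand without making the code
match, and you have signed a contract the code can't honor.
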
> I may be getting a little off-topic here, and this important topic
> probably deserves its own thread some day, but come to think of it,
> the overhead induced by dynamic linking, symbol versioning and the
> crazy workarounds needed to make the dynamic linker remotely safe
> nowadays completely destroys the once-good reasons for dynamic
> libraries.

Do I hear echoes of the /usr argument? Speed and large drives "solve"
this one problem, so we can throw sense and reason out the window?

> Static linking eliminates all the issues involved with symbol versions,

If only we had perfect libraries to start with.

> wrecking your system with a library update, and I'd even go so far
> as to say that static binaries are more or less independent of
> distributions.

As long as you ignore the inherent implicit linkages to the OSes and
the tools being used at a certain point in time, sure, theoretically,
distribution independence can happen. Probably in the same universe
where OpenBSD is a Linux distribution. (But then you must remember
never to argue that Linux Is Not UniX!)
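
For the record, the build mechanics being argued over fit in a toy
example. A minimal sketch, assuming a stock cc toolchain; the file
and binary names are made up, and exact flags and ldd output vary by
platform:

    /* hello.c -- build it both ways and compare:
     *
     *   cc -o hello-dyn hello.c             # resolves libc.so via ld.so at run time
     *   cc -static -o hello-static hello.c  # copies the needed libc objects in
     *   ldd hello-dyn                       # lists libc.so.*
     *   ldd hello-static                    # typically "not a dynamic executable"
     */
    #include <stdio.h>

    int main(void)
    {
        printf("hello\n");
        return 0;
    }

The static binary drags its own snapshot of libc along with it, which
is both the distribution-independence claim and the
recompile-everything-after-a-libc-fix cost, in one file.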

> Memory usage is also not a real argument any more, because we
> have shared tables nowadays and, in many cases, statically linked
> programs need less RAM than dynamically linked ones.

Even if your arguments were not random, would they be meaningful?

You not only fail to prove your assertions, you fail to explain their
relevance. Why?

> So, what are the remaining arguments against static linking?

What is your underlying argument? Static vs. dynamic is a null
argument, so you must be using it as noise cover to sell something
really noxious.

> I agree there are programs which do not run well if statically linked
> (the X-server for instance), but that's more or less a matter of design
> and can be worked around in most cases.

???

> One specific point often raised is the argument that if you have
> an update for a specific library (a security update, for instance),
> you just need to recompile the library and all programs depending
> on it will behave correctly.
> With static libraries, on the other hand, you would have to recompile
> all binaries and superset-libraries depending on this library for
> the security fix to be effective.
> This point is increasingly losing significance for the following
> reasons:

Riding your noise waves, to look for clues, let's see:

> 1) hot-swapping a library for a security-fix implies that the ABI
> doesn't change, i.e. that the binaries on your system using this
> library can still find its functions where they were originally
> told to look for them.

So, you are saying that there are so many bugs in the APIs that it is
useless to fix non-API bugs? Cool. We can all go home now.
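
To make point 1) concrete, here is a self-contained sketch of what an
unnoticed ABI change does; every name is hypothetical, and the two
struct definitions stand in for a binary and a "fixed" library that
disagree about a layout:

    /* abi-break.c -- cc -o abi-break abi-break.c && ./abi-break
     * The out-of-bounds read at the end is deliberate; it is the bug
     * being illustrated. */
    #include <stdio.h>

    /* Layout the program was compiled against ("libfoo 1.0"). */
    struct foo_old { int a; };

    /* Layout after a "harmless" fix that also grew the struct. */
    struct foo_new { int flags; int a; };

    /* Stand-in for the rebuilt library: it believes the new layout. */
    static int foo_get_a(void *f)
    {
        return ((struct foo_new *)f)->a;   /* reads offset 4, not 0 */
    }

    int main(void)
    {
        struct foo_old f = { 42 };   /* caller still allocates 4 bytes */

        /* The "library" reads past the caller's object: stack garbage,
         * wrong answers, or a crash, exactly Ingo's list above. */
        printf("caller stored 42, library read %d\n", foo_get_a(&f));
        return 0;
    }

That mismatch is what you compile into existence when you bump
shlib_version by hand: the major number exists precisely so that
ld.so refuses to pair a binary with a library laid out like this.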

> In many cases nowadays, bugs are fixed concurrently with version
> bumps (major & minor), which means that all binaries have to be
> manually updated and recompiled anyway.

What do you mean by manually, anyway?

> 2) compiling is not expensive any more (in most cases).
> On my Gentoo-based system, it just takes 2 hours to recompile the
> entire operating system including all user-space applications.

Gee, I wish I had a 512 core 14GHz Intel Core 10 with 16 Terabytes of RAM ...

... and the requisite nuclear reactor to power it.

> Moore's law will significantly decrease this time over the years.

Intel's going to start shrinking silicon atoms now?

> Imagine if it just took 5 minutes,

as opposed to 5 seconds?

Why not? We're in the imaginary world anyway.

> would there still be a reason to
> have a hand-crafted

Hand crafted, now? What does that mean?

> dynamic linker to carefully dissect libraries
> and binaries, imposing a run-time loss and lots of
> security-considerations?

Oh, I'm the boogie-woogie man!

(The Nightmare before Christmas is actually a rather deep movie.)

> I'm not talking about beasts like libreoffice, chromium and others.
> There are better alternatives around and if not, there will be in the
> future.

[1] We do not accept responsibility for changes in business plans and
other artifacts of the future meeting reality.

> For huge packages, it should be simple enough to design the
> package-manager

There's one. I'm not the only one who noticed the One Package Manager
being snuck in.

But, of course, it's a McGuffin. So let's keep looking.

> in a way that serves static binaries and, in case there is a
> library-fix, tells all clients to redownload the current version.

Oh. There's another one. Automated traffic makes a great
steganographic medium, doesn't it?

> So the only real worry here is to have a clean build-
> environment on the build-servers (designed by experts)

Oh, yeah, experts and centralization. That's three.

> and not to waste
> hundreds of man-hours designing systems to cope with the dll-hell
> almost all Un*xes have become on the client-side.

Finally, we get close to the core argument: "The problems you are
having are not our fault, but our snake oil is your solution!"

Not your fault at all, is it?

> Why is Linux/BSD

There you go again.

> not popular on the desktop?

And again.

> Because of fragmentation.

And repeat the McGuffin.

> And one reason for fragmentation is that you can't use Debian packages
> in Ubuntu, mostly because there are library incompatibilities.
> Other reasons include a lack of good software, but that's just a
> matter of time. And if we can get more developers to work on useful
> stuff instead of having to worry about library-versioning, this goal
> could be reached in a shorter time.

"This goal?"

And what exactly is this goal? I mean the real, hidden goal?

> It may be a little far-fetched, but I'm sure it would be possible
> to have one package-manager for all distributions if there were just
> the motivation to distribute statically linked binaries and not fuck
> things up with distribution-specific folder-structures.

Sure, uniform directory structures makes the NSA's job easier, doesn't
it? (Makes all the attacks easier, but we'll ignore that for now.)

> 3) security

"Won't somebody think of the _____!?!?"

> Well, the issues with dynamic linking have been stated often
> enough[0][1][2][3][4].

Where did this bad habit of explaining yourself out-of-band come from?
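
Mechanism check, since [0] is the one people usually have to see to
believe: loading a shared object executes code, before main() is
reached and before any symbol is looked up. A minimal sketch; the
file name is invented, __attribute__((constructor)) is the GCC/Clang
spelling, and build flags vary by platform:

    /* evil.c -- cc -shared -fPIC -o evil.so evil.c
     * Any process that maps this object runs on_load() with its own
     * privileges; no function from the "library" need ever be called. */
    #include <stdio.h>
    #include <unistd.h>

    __attribute__((constructor))
    static void on_load(void)
    {
        fprintf(stderr, "evil.so loaded into pid %ld\n", (long)getpid());
    }

Constructors like this are one face of the broader problem the links
above catalogue: the dynamic loader is a code-execution engine, and
anything that invokes it (ldd included, as [0] shows) inherits that.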

> As far as I understand, the initial motivation of the OpenBSD-project
> was to favor security over speed.

I wonder if you are really operating from that assumption.

No, I think you know that's a misrepresentation.

> It just puzzles me that issues like
> dynamic linking have not yet been discussed broadly or dealt with in
> the last few years given these obvious negative implications.
>
> Please let me know what you think.

I suppose it's inevitable that someone will want to control the world,
but I really wish you and your friends would quit lying to yourselves
about power.

Computers are not jinn to do your bidding, and they are not magic
boxes that will solve the problems of trying to control your universe.

Tools. All they are is tools.

Like all tools, they can amplify your efforts a bit, but amplification
doesn't make the problems go away.

Use them as a tool to help you think, and maybe you can really solve
your own problems instead of trying to use them to force other people
to solve their problems your way.

> Cheers
>
> FRIGN
>
> [0]: http://www.catonmat.net/blog/ldd-arbitrary-code-execution/
> [1]: http://benpfaff.org/papers/asrandom.pdf
> [2]: http://web.archive.org/web/20120509105723/http://teddziuba.com/2008/09/a-web-os-are-you-dense.html
> [3]: https://www.nth-dimension.org.uk/pub/BTL.pdf
> [4]: http://harmful.cat-v.org/software/dynamic-linking/versioned-symbols
>
> --
> FRIGN <[email protected]>
>

-- 
Joel Rees

Be careful when you look at conspiracy.
Look first in your own heart,
and ask yourself if you are not your own worst enemy.
Arm yourself with knowledge of yourself, as well.
