Thanks Stephen and Jed, your description makes my concern seem much less rational. I keep reading about the various ways the Windows operating system gets hacked because it is poorly written. It's good to hear that systems are being developed that don't have these problems and are written to be less sensitive to viruses and other kinds of malicious code changes.

Ed


On Jul 8, 2009, at 3:06 PM, Stephen A. Lawrence wrote:



Edmund Storms wrote:
This helps explain the situation, Stephen.  However, suppose I make
some neat changes in an open source program and add a few backdoors.
Then I send it to my friends, who use it and send it to their friends
because of the neat features I added.  Eventually, the code becomes
widespread. The backdoors would not be discovered unless someone who
knows the code and has time to check any changes finds them.  Why has
this not happened to Linux?

Open source doesn't normally work that way. I've never heard of anyone installing a random hacked-up version of Linux which they got from their
friends (or a random hacked-up version of any other large open source
system).  Do you use random hacked-up versions of OpenOffice?  No, I'm
sure you don't, and nobody else does, either -- it's not just rare, it's
unheard of.

Aside from the fact that you have to be an idiot to install major system
components of unknown provenance, there's the fact that the major
organizations have hordes of elves maintaining the open source systems,
and custom versions fall out of date *very* fast unless the custom
changes are folded back into the mainline. A new version of Linux comes
out about every six months (for the major distributors) and updates to
components come out almost daily.  GoogleOS will probably have updates
coming out at the same breakneck pace if it ever gets off the ground.

Now, you're presumably actually just talking about a hacked up kernel,
rather than the whole umpteen gigabyte system. So, what kind of feature
can you imagine that your friend might add to a kernel that would
convince you to use *his* kernel rather than one blessed by Linus?
Personally I can't imagine such a feature -- I wouldn't trust a hacked
kernel, of course, but more to the point, when the next new kernel comes out, typically in about 30 days, what am I going to do about my friend's
patches?  Apply them to the new kernel myself?  Get a new kernel from
the friend to drop on top of the new one I just got from
Redhat/Yellowdog/Debian/whoever? It's a nightmare to go that road, and
nobody's going to do it -- they'll use the kernels provided for their
system, or one from an equally well known source.
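To make that maintenance burden concrete, here's a toy sketch -- plain text files standing in for kernel sources, and GNU diff/patch assumed -- showing how a friend's custom patch simply stops applying once upstream rewrites the code it touched:

```shell
# Build "v1" of a source file and a friend's hacked copy of it
# (contents are made up purely for illustration).
workdir=$(mktemp -d)
cd "$workdir"
printf 'init code\nold driver call\ncleanup code\n' > kernel-v1.c
sed 's|old driver call|old driver call /* backdoor */|' kernel-v1.c > hacked.c

# Capture the friend's changes as a patch (diff exits 1 when files differ).
diff -u kernel-v1.c hacked.c > friend.patch || true

# Upstream's next release rewrites the surrounding code entirely.
printf 'rewritten init\nnew driver call\nrewritten cleanup\n' > kernel-v2.c

# Carrying the old patch forward now fails outright.
if patch --dry-run -s kernel-v2.c < friend.patch; then
  echo "patch still applies"
else
  echo "patch rejected -- rework it by hand for the new release, or give up"
fi
```

With a real kernel the same thing happens at scale: every release can invalidate an out-of-tree patch, which is why custom changes either get folded back into the mainline or quietly rot.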

Up above I said "you'd have to be an idiot" and I should explain that,
because I was thinking of a rather specific set of reasons why you'd
need to be inhumanly stupid to drop a random (unknown) kernel into your system. If you're Joe Sixpack you're not going to be dropping a random
hacked kernel into your system; you don't know how -- Joe Sixpack
doesn't know enough to put himself in the dangerous situation to start
with.  Joe's going to be using an off the shelf kernel, with updates
provided by the vendor who supports his Linux (or GoogleOS) variant.
On the other hand, if you *do* have the expertise to replace individual
system components with nonstandard versions, then you're also going to
be aware of the danger of using an unknown kernel.  So, it's only the
person who knows how, knows better, and does it anyway who could
possibly get burned here -- and, as I said, if you really knew you
shouldn't do that, then you're an idiot if you do it anyway, and people
in general are NOT IDIOTS.  In short, the ignorant ones won't do it
because they can't, and the educated ones won't do it because they know
better.

Now, let's get back to the issue you hinted at, which is that nobody'll
have time to track down all the possible security holes introduced by
random hackers.  I don't know how familiar you are with large open
source projects, but they are *not* run like Wikipedia. To get a patch
folded back into the Linux mainline, you have to get it past Linus (I
mean that literally -- last I heard Linus Torvalds was still vetting
everything that went into the kernel). And if you want it folded into, say, Redhat's custom version of the kernel, you need to get it past the
people doing code reviews at Redhat.  They don't just look at the nice
cover letter you wrote and say, "Oh, this sounds like fun, let's stick it in the next release." They actually look at the code, too, and when it's a
patch from an unknown outsider, you better believe they look at it
pretty carefully!

In fact, the only people you really have to fear anything along these
lines from are the INHOUSE developers at the OS maintainer's main
office.  And those people exist at Microsoft, too -- consider
particularly the "mouse mess" at Microsoft a while back:

http://www.grc.com/wmf/wmf.htm

Finally, open source OS code is likely to be *better* *vetted* than
closed source code. It's not clear the "mouse mess" could have remained
hidden in Linux for nearly so long as it was in Microsoft's OS -- it
lurked in there for years before someone noticed it, and Microsoft was
slow to admit there was a problem or do anything about it after someone
found out, which resulted in a huge number of exploits in the wild.
Part of the reason it can be so hard to do anything about problems like this in a closed source system, of course, is that almost nobody gets to
look at the code, so the pool of potential whistleblowers is very
restricted.

For a second example, google "rootkit" and "sony" to find out just how
badly you can get nailed when you're dealing with closed source.  Once
again, the ones who were playing fast and loose with the internals were
not hackers (they can't, they're not in a position to do that).  They
were the inhouse programmers at Sony, working with full access to the
(secret) source.  And the ones who got nailed were the general public,
duffer and expert alike, who are not allowed to see the secret sources,
and so can't know what's actually running on their systems...




Ed


On Jul 8, 2009, at 1:24 PM, Stephen A. Lawrence wrote:



Edmund Storms wrote:
They say this is an open system, which has the advantage of putting
the user in control. Why would it not also put the hacker in control?
What's the problem with open source, aside from the fact that anyone can
learn how the system works?  I don't see one.

Security based on secrecy doesn't work very well -- one leak and you're dead meat -- so opening the source is not in itself a problem. In fact
it's widely felt that voting machine software, to name one example,
would be far more secure if it were entirely open.

Secret "backdoors" are secure as long as they're secret, but they're
generally considered totally insecure, because they don't stay secret.

The only thing opening the source does is make it impossible for the vendor to prevent improvements by anyone else. The user
chooses the software to put on their machine, and they'll choose the
version from Google *unless* there's a version which is better (or
equally good and cheaper). With a closed-source system, on the other
hand, you can drop the "*unless*" clause: there is only one version
available.  And that's the only real difference.

Finally, as an observation on who this helps and who it hurts, my guess is it's going to end up hurting the consumers most of all. Google is a company driven *entirely* by ad revenue AFAIK, and one of their primary missions seems to be to make ad delivery (and content delivery) secure and reliable for the advertisers and content vendors. They are squarely on the opposite side of the fence from the FSF.

Check out Chrome, and think about these questions:
What's Chrome got?  Lovely UI.
What's it missing?  Cookie control!!
You get better tracking cookie control with IE than you do with Chrome!
Unless Google has changed this, the concept of arbitrarily limiting
cookie lifetimes to the life of the session (with a list of exceptions)
is completely missing from Chrome.  I believe there were some other
cookie control issues as well, but that was the big one, which really stood out for me: Use Chrome, be tracked, it's as simple as that -- and
the old argument that they can't match up the cookies with *you* is
either already false or certainly likely to be false in the future.
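For what it's worth, the missing feature is easy to describe mechanically: a "session only" cookie policy amounts to stripping the Expires/Max-Age attributes a server sends, so the cookie dies when the browser closes instead of persisting for years. A toy sketch (made-up header values, not anything Chrome actually exposes):

```shell
# A long-lived tracking cookie, roughly as a server might set it.
header='Set-Cookie: track_id=abc123; Expires=Wed, 09 Jul 2014 10:18:14 GMT; Max-Age=157680000; Path=/'

# "Limit to session" = drop the lifetime attributes before storing the cookie.
session_only=$(printf '%s' "$header" | sed -E 's/; *(Expires|Max-Age)=[^;]*//g')
echo "$session_only"
# prints: Set-Cookie: track_id=abc123; Path=/
```

Firefox does this kind of rewriting internally when you set cookies to expire at end of session, with a per-site exception list; that's exactly the knob Chrome lacked.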

If Google can push something on consumers which "frees" them from
Microsoft while simultaneously "freeing" the vendors from the nasty
cookie controls of Firefox, they'll view it as a home run, I'm sure.


Ed
On Jul 8, 2009, at 12:49 PM, OrionWorks wrote:

I've labeled this thread "OT" because the subject would seem to be
unrelated to the issues concerning the occasionally scrappy process of
developing alternative energy strategies.

But then... maybe it does bear some resemblance:

http://www.cnn.com/2009/TECH/07/08/google.chrome.os/index.html

Regards
Steven Vincent Johnson
www.OrionWorks.com
www.zazzle.com/orionworks





