Michael Gorven wrote:
On Wednesday 19 August 2009 14:42:37 Vladimir 'phcoder' Serbinenko wrote:
Even if they can't stop it from working entirely, they can make it
effectively useless by e.g. not allowing you to watch online videos, buy
online or even just send an e-mail (calling it "spam control") if you
aren't TPM-checked
That falls under the supporting-possibly-harmful-technology argument. It's not
very different from saying "you must use Silverlight to view videos" and
whatnot. If you don't want to follow their requirements, then don't.
2) Similar features can be implemented without resorting to a TPM by
using coreboot and making every stage verify the signature of the next
stage.
Trust has to start somewhere, and the more difficult it is to compromise
that the better.
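The staged verification described above can be modelled crudely. This Python sketch is only an illustration: it uses SHA-256 hash whitelisting in place of real signature checks, and the stage names and images are invented, not taken from coreboot or GRUB.

```python
import hashlib

def digest(image: bytes) -> str:
    """Hash a stage image; a real chain would verify a signature instead."""
    return hashlib.sha256(image).hexdigest()

# Hypothetical stage images (stand-ins for romstage, payload, GRUB...).
stages = {
    "romstage": b"romstage code",
    "payload":  b"payload code",
    "grub":     b"grub core image",
}

# Hashes recorded at build time, when the chain was known-good.
expected = {name: digest(image) for name, image in stages.items()}

def boot_chain(order, images, expected):
    """Run each stage only if its image matches the recorded value."""
    for name in order:
        if digest(images[name]) != expected[name]:
            raise RuntimeError("refusing to boot tampered stage: " + name)
    return "booted"

print(boot_chain(["romstage", "payload", "grub"], stages, expected))

# Tampering with any stage breaks the chain:
tampered = dict(stages, grub=b"evil grub")
try:
    boot_chain(["romstage", "payload", "grub"], tampered, expected)
except RuntimeError as e:
    print(e)
```

The point of the structure is that each stage only has to trust the one before it; the root of that trust is whatever holds the first stage (write-protected flash, or a TPM).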
A flash ROM with the write wire cut is impossible to compromise without
physical access.
Valid solution, but does it protect the contents of the flash ROM? (i.e. can
you read the contents?) A minor point is that it does mean you can't upgrade
your BIOS anymore. It also gets tricky if you want to securely store a
hard-drive decryption key, though.
3) Read the PCR (TPM_PCRRead command) and compare it to a value recorded
during a previous (safe) boot. We assume that the previous link of
the chain of trust (BIOS?) has already checked that GRUB hasn't been
tampered with before starting it.
You propose to check that our checksum in the PCR is OK, but you already
assume GRUB wasn't tampered with. If you assume GRUB wasn't tampered
with, there is no need to checksum; if you don't, the checksum is useless.
That isn't assumed -- the BIOS checks that GRUB hasn't been tampered with
before handing control to it.
Coreboot can do this too. And the firmware doesn't need a TPM to perform
such checks.
Yes, except coreboot isn't widely supported.
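For reference, the PCR comparison under discussion rests on hash chaining: a PCR is never written directly, only extended. A minimal Python model (TPM 1.2 uses SHA-1 and 20-byte PCRs; the measured component images here are invented, and real measurements are made over actual code regions):

```python
import hashlib

PCR_SIZE = 20  # TPM 1.2 PCRs hold 20-byte SHA-1 values

def extend(pcr: bytes, measurement: bytes) -> bytes:
    """Extend semantics: new PCR = SHA-1(old PCR || SHA-1(measurement))."""
    return hashlib.sha1(pcr + hashlib.sha1(measurement).digest()).digest()

def measure_boot(components):
    """Extend a PCR with each boot component in order, starting from zeros."""
    pcr = b"\x00" * PCR_SIZE
    for image in components:
        pcr = extend(pcr, image)
    return pcr

# A value recorded during a known-good boot...
good = measure_boot([b"bios", b"grub", b"kernel"])
# ...matches a later, identical boot,
assert measure_boot([b"bios", b"grub", b"kernel"]) == good
# but changing any component, or the order, changes the final PCR.
assert measure_boot([b"bios", b"evil grub", b"kernel"]) != good
```

Because extension is one-way, software that has already booted cannot rewrite the PCR to hide what was measured before it; that is the only property the chip adds over a plain recorded checksum.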
Full TPM support means that GRUB should also be able to ask a
remote authority whether the content of the PCR is still OK...
Why do I, as the user, need someone else to check my computer?
Because you don't always own or completely control the computer.
Then someone is already holding you hostage. We won't help them to
restrict your freedom further.
Or you're the person who owns and wants to secure the computer. Maybe you want
to co-locate your server and make sure the technicians at the DC can't
compromise it, or you're guarding against data loss if your laptop gets
stolen without having to enter decryption passwords on boot, or a whole lot
of other situations where *you* are putting *your* computer in an untrusted
environment.
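The remote attestation being debated above can be modelled crudely too: the verifier sends a fresh nonce, and the machine returns its PCR value bound to that nonce by a key the verifier trusts. This sketch substitutes an HMAC with a shared key for the TPM's attestation-identity-key signature, and the key and PCR contents are invented:

```python
import hashlib, hmac, os

# Stand-in for the TPM's attestation identity key (a real TPM signs
# quotes with an RSA AIK; a shared HMAC key keeps this sketch small).
AIK = b"attestation-key-known-to-verifier"

def tpm_quote(pcr: bytes, nonce: bytes) -> bytes:
    """Bind the current PCR value to the verifier's nonce."""
    return hmac.new(AIK, pcr + nonce, hashlib.sha1).digest()

def verify_quote(quote: bytes, expected_pcr: bytes, nonce: bytes) -> bool:
    """Verifier side: recompute the quote over the PCR value it expects."""
    return hmac.compare_digest(quote, tpm_quote(expected_pcr, nonce))

expected_pcr = hashlib.sha1(b"known-good boot").digest()
nonce = os.urandom(20)  # a fresh nonce prevents replaying an old quote

# An untampered machine passes...
assert verify_quote(tpm_quote(expected_pcr, nonce), expected_pcr, nonce)
# ...a machine whose boot chain measured differently fails.
bad_pcr = hashlib.sha1(b"tampered boot").digest()
assert not verify_quote(tpm_quote(bad_pcr, nonce), expected_pcr, nonce)
```

Who holds the verifier's role is exactly what the argument above is about: it can be you checking your co-located server, or a third party checking you.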
Suppose you are the proud technical support person at a third-world
school that just bought a thousand OLPC XOs. You, as part of your
country's government, are instructed to own those XOs. If they are
stolen and get into the hands of innocent third-world civilians who want
a computer, those computers are instructed to shut off within 24 hours
and demand to be returned to their original location before they're
willing to work again. This harms ordinary users who loved their
student computer; they might choose not to return it anyway. It does
not harm the thieves who steal the XOs and break them down into parts
and sell the parts. Civilians with enough expertise might be able to
replace the BIOS flash to get a working computer again, but there will
be many ordinary users who can't figure out how to do this (or perhaps,
don't have the tools). This restriction is not necessarily what makes
OLPC designers happy; it is merely a condition set by many governments
who decided to shell out lots of money for XOs for their schools.
Technical measures can never decide who, morally, "owns" a computer.
When software technical measures do try to decide that, it is almost
always the technically adept and/or the manufacturers who win. In a
project for a Free world, I suspect we want to give our best chance to
anyone who ends up with a computer by any means... It's why I still keep
an unrestricted boot option on my computer (without access to my
personal-data encryption password, of course - that's in my head), so that
if someone accidentally ends up with my computer, doesn't return it,
and doesn't figure out how to reinstall an OS on it, the computer won't
go completely to waste economically... :-)
The cost of taking that stance is that when you remote-access a
computer, it (more easily)* might be running different software than you
expected. Is that so unreasonable? If you need an Internet-connected
secured computer, maybe you can put it in your home, or rent one from a
local company (or friend) who you trust to keep your data safe because
you know them?
*some options:
- Lock down not very much at all: let anyone boot a CD, or even log in
directly as root, or at least replace your hard drive. Allowing only
the first probably makes you safer from someone lazily grabbing your
computer out of your hands and deleting your files in a stroke of anger;
adding some delay in time or practicality that's still much less than the
nuisance of replacing hardware can be an okay compromise.
- Lock down via open chain of trust: Coreboot, and so forth, verifying
signatures. Neither booting a CD nor removing, modifying, or replacing
your hard drive will allow the computer to boot something different.
Different software can only run if an attacker figures out how to
physically replace your BIOS. This is claiming ownership of your
computer for as long as it remains your computer (or until someone
steals your personal passwords or personal crypto keys). It's an open
design, but it's worth noticing that by choosing even this, you are
still trying to use technical measures to decide who owns a
computer... it just gets less 1984-esque when someone does decide to
replace your computer's BIOS (they can use a standard chip rather than a
horrible hack discovered by black hats)... This choice might be a good
one to use in airplane cockpits.
- Lock down via proprietary crypto chip (TPM). Different software can
run if an attacker figures out how to break into your TPM, which is
quite possibly easier, not harder, than replacing hardware, because
TPMs are closed systems that don't disclose their designs and
flaws... This option is not safe from TPM manufacturers, even if it does
*seem* convenient and secure (considering how many PCs have TPMs these
days). This might be okay for airplanes because:
  - Airplane manufacturers are big enough to negotiate with TPM
    manufacturers.
  - Airplane control systems had better never function as ordinary
    computers for ordinary people!
  - (Isolating the risks in a smaller chip might be safer from
    electromagnetic effects; except that you don't actually get
    reliability that way. You can render every security measure here,
    even TPM remote attestation, flawed as soon as your RAM becomes
    unpredictable. Not in a convenient way, but it should definitely be
    possible.)
Also, none of the airplane arguments really apply to small,
non-life-critical systems.
If car manufacturers build PCs into cars for people's enjoyment,
the PCs should not be locked down; the critical circuits should use
separate chips anyway, because it's simply better engineering practice not
to rely on a fast multi-purpose computer when you don't have to.
I think we need to be activists for open (e.g. Coreboot-based)
security. Fewer of its possible scenarios lead to dystopian
circumstances. Too many people expect and demand a logical chain of
security for their computers (I'm not one of them; I don't want to lock
down my laptop, as above). I don't know if this chain of security is
"useful" in an absolute sense, but it is nevertheless part of the
struggle to make computers more open and understandable, including
helping people better understand the comparative role of the TPM. I
believe this role is: a very badly implemented form of basically the
Coreboot chain of trust, plus a form of remote attestation that requires
you (or anyone) to tech-battle the manufacturer to circumvent (or,
instead of battling, maybe you're an agency that can convince
manufacturers to give you a backdoor; money and slimy promises are good
tools for this). I'm not sure, I might be missing something here --
what do you think about it?
-Isaac
_______________________________________________
Grub-devel mailing list
Grub-devel@gnu.org
http://lists.gnu.org/mailman/listinfo/grub-devel