Re: fyi: bear/enforcer open-source TCPA project

2003-09-08 Thread bear


On Mon, 8 Sep 2003, Sean Smith wrote:

>How can you verify that a remote computer is the "real thing, doing
>the right thing?"

You cannot.

>In contrast, this code is part of our ongoing effort to use open
>source and TCPA to turn ordinary computers into "virtual" secure
>coprocessors---more powerful but less secure than their high-assurance
>cousins.

The correct security approach is to never give a remote machine
any data that you don't want an untrusted machine to have. Anything
short of that *will* be cracked.

Bear

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: fyi: bear/enforcer open-source TCPA project

2003-09-09 Thread Sean Smith
> 
> >How can you verify that a remote computer is the "real thing, doing
> >the right thing?"
> 
> You cannot.

Using a high-end secure coprocessor (such as the 4758, provided the
application isn't flawed) will raise the threshold for the adversary
significantly.

No, there are no absolutes.  But there are things you can do.
 
> The correct security approach is to never give a remote machine
> any data that you don't want an untrusted machine to have. 

So you never buy anything online, or use a medical facility
that uses computers?

-- 
Sean W. Smith, Ph.D. [EMAIL PROTECTED]   
http://www.cs.dartmouth.edu/~sws/   (has ssl link to pgp key)
Department of Computer Science, Dartmouth College, Hanover NH USA






Re: fyi: bear/enforcer open-source TCPA project

2003-09-10 Thread bear


On Tue, 9 Sep 2003, Sean Smith wrote:

>>
>> >How can you verify that a remote computer is the "real thing, doing
>> >the right thing?"
>>
>> You cannot.
>
>Using a high-end secure coprocessor (such as the 4758, but not
>with a flawed application) will raise the threshold for the adversary
>significantly.

The problem with this is Moore's law.  By the time your high-end
coprocessor is widely adopted, most of the units actually in the field will
no longer be high-end.  And the kid who has the latest hardware will
always be able to emulate an older secure coprocessor in real time, the
same way people used to use hacked printer drivers to simulate the
presence of hardware dongles on the parallel port.  So this doesn't
work unless you put a "speed limit" on CPUs, and that's ridiculous.

>No, there are no absolutes.  But there are things you can do.

Yes.  Protocol designers have been explaining how to do them for
decades.  There is usually a protocol that gives untrusted machines
only data suited for handling by untrusted machines, while
still providing the appropriate benefits.

There are things you can't do that way, of course; a machine cannot
display information to a human that it does not have.  But a
remote-and-therefore-untrusted machine is in front of a
remote-and-therefore-untrusted human, and therefore ought not do such
a thing anyway.

Designing applications that use protocols to achieve design goals
without ever transmitting information that an untrusted machine ought
not have is hard.  But it is possible, and until it's done we're going
to see a parade of cracked applications and hacked hardware destroying
every business plan that's built on it and every life that depends on
it.  Depending on a solution that lets "remote but trusted" hardware
handle information that the remote machine shouldn't have in the first
place is an invitation to be hacked, and an excuse to avoid the hard
work of designing proper protocols.

>> The correct security approach is to never give a remote machine
>> any data that you don't want an untrusted machine to have.
>
>So you never buy anything online, or use a medical facility
>that uses computers?

Online credit-card purchases are ten percent fraudulent by volume.
Crypto is widely deployed for credit-card purchases, but stemming
fraud seems to be like trying to dig a hole in water.  Points made
here recently about who has a motive to stop fraud seem applicable.

And, significantly, much of this fraud is done by people who manage to
crack the merchants' databases of credit card numbers and accounts,
which are kept in cleartext.  I don't think any crypto infrastructure
is going to stop "personal" card fraud by someone who's got your card
out of your wallet.  Boyfriends, girlfriends, roommates, and siblings
commit a lot of fraud on each other.  But a better protocol design
should at least put credit card numbers in merchant databases out of
reach of crackers - by never letting the merchants get them in
cleartext.

A merchant should wind up with a unique "purchase code" - a blob from
which the bank (and no one else) ought to be able to tell the payee,
the amount, the source of funds, and the date of the transaction.
This is fairly simple to do with asymmetric encryption where the bank
has a published key.  A merchant should NOT wind up with a cleartext
credit card number for an online purchase.  Someone hacking the
merchant's database should NOT wind up with information that can be
"replayed" to commit frauds.  This isn't a matter of transmitting
privileged (sensitive) information to a "remote but trusted" machine;
this is a matter of finding an appropriate (non-sensitive) form of
information that a remote machine can be trusted with.  No special
hardware is required; it's just a matter of using the appropriate
protocol.
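
A rough sketch of the shape I have in mind, in Python with RSA-OAEP from
the pyca/cryptography package (the field names, the bank-key handling, and
the "purchase code" format are illustrative assumptions, not any deployed
payment protocol; a real system would use hybrid encryption and certified
keys):

# Sketch: the merchant stores only an opaque "purchase code"; only the
# bank, which holds the matching private key, can recover the card
# number, payee, amount, and date.
import json, os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Stand-in for the bank's published key pair (in practice distributed
# out of band and certified).
bank_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
bank_public = bank_private.public_key()

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

def make_purchase_code(card, payee, amount, date):
    """Customer side: seal the transaction details so only the bank can
    read them.  The nonce makes each blob single-use, so a blob lifted
    from the merchant's database can't be quietly replayed."""
    payload = json.dumps({"card": card, "payee": payee, "amount": amount,
                          "date": date, "nonce": os.urandom(8).hex()})
    return bank_public.encrypt(payload.encode(), OAEP)

def bank_settle(purchase_code):
    """Bank side: recover payee, amount, source of funds, and date."""
    return json.loads(bank_private.decrypt(purchase_code, OAEP))

# The merchant sees and stores only the blob -- never the card number.
blob = make_purchase_code("4111111111111111", "merchant-123", "49.99",
                          "2003-09-10")
assert bank_settle(blob)["payee"] == "merchant-123"

No secure hardware anywhere in that; the merchant simply never holds
anything worth stealing.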

Frankly I don't know enough about how medical records are handled to
say much about them - I couldn't even make a good assessment of the
operational requirements.  But the information has huge economic value
as well as huge personal privacy value.  Its inappropriate disclosure
or misuse can destroy lives and livelihoods.  It ought to be
treated as a target for theft, and protected accordingly.

Bear




Re: fyi: bear/enforcer open-source TCPA project

2003-09-10 Thread Sean Smith

> So this doesn't
> work unless you put a "speed limit" on CPUs, and that's ridiculous.

Go read about the 4758.  CPU speed won't help unless
you can crack 2048-bit RSA, or figure out a way around
the physical security, or find a flaw in the application.
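
To make that concrete: the remote party's trust reduces to checking a
signature produced inside the box.  A minimal sketch in Python (using the
pyca/cryptography package; the key names and statement format here are
illustrative, not the 4758's actual outbound-authentication format):

# Sketch: the remote party trusts a statement only if it carries a valid
# signature under the device key.  A faster CPU on the attacker's side
# doesn't help; forging the statement means forging a 2048-bit RSA
# signature.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.exceptions import InvalidSignature

# Stand-in for the key pair generated inside the coprocessor; the verifier
# only ever sees the certified public half.
device_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
device_public = device_private.public_key()

def device_attest(statement: bytes) -> bytes:
    """Done inside the box: sign a statement about the loaded software."""
    return device_private.sign(statement, padding.PKCS1v15(), hashes.SHA256())

def relying_party_accepts(statement: bytes, signature: bytes) -> bool:
    """Done remotely: accept the statement only if the signature verifies."""
    try:
        device_public.verify(signature, statement,
                             padding.PKCS1v15(), hashes.SHA256())
        return True
    except InvalidSignature:
        return False

stmt = b"firmware-hash=ab12cd34; application=enforcer"
sig = device_attest(stmt)
assert relying_party_accepts(stmt, sig)
assert not relying_party_accepts(b"some other configuration", sig)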


> Yes.  Protocol designers have been explaining how to do them for
> decades.  

But (at a high level) there are things that are awkward
or extremely impractical to do with, say, multi-party computation.

That's where the "secure hardware" work---from Abyss, to TCPA, to
plastic-speckles, to the CPU+ work at MIT and Princeton---comes in.  



--Sean

-- 
Sean W. Smith, Ph.D. [EMAIL PROTECTED]   
http://www.cs.dartmouth.edu/~sws/   (has ssl link to pgp key)
Department of Computer Science, Dartmouth College, Hanover NH USA






Re: fyi: bear/enforcer open-source TCPA project

2003-09-11 Thread bear


On Wed, 10 Sep 2003, Sean Smith wrote:

>
>> So this doesn't
>> work unless you put a "speed limit" on CPUs, and that's ridiculous.
>
>Go read about the 4758.  CPU speed won't help unless
>you can crack 2048-bit RSA, or figure out a way around
>the physical security, or find a flaw in the application.

You propose to put a key into a physical device and give it
to the public, and expect that they will never recover
the key from it?  Seems unwise.

Bear



Re: fyi: bear/enforcer open-source TCPA project

2003-09-11 Thread Sean Smith

>You propose to put a key into a physical device and give it
>to the public, and expect that they will never recover
>the key from it?

It's been on the market for six years now; so far, the foundation
has held up.  (We also were darn careful about the design
and evaluation; we ended up earning the first FIPS 140-1 Level 4
cert, but went beyond it in several respects.)

But there are numerous war stories and drawbacks---which is
why I find the new generation of initiatives interesting.
(Particularly since I don't have to build products anymore! :)

> Seems unwise

As does the alternative proposition that one should NEVER, under any 
circumstances, have sensitive data or computation on a remote machine.

--Sean

-- 
Sean W. Smith, Ph.D. [EMAIL PROTECTED]   
http://www.cs.dartmouth.edu/~sws/   (has ssl link to pgp key)
Department of Computer Science, Dartmouth College, Hanover NH USA






Re: fyi: bear/enforcer open-source TCPA project

2003-09-11 Thread Rich Salz
> You propose to put a key into a physical device and give it
> to the public, and expect that they will never recover
> the key from it?  Seems unwise.

You think "the public" can crack FIPS devices?  This is mass-market, not
govt-level attackers.

Second, if the key's in hardware you *know* it's been stolen.  You don't
know that for software.
/r$
--
Rich Salz  Chief Security Architect
DataPower Technology   http://www.datapower.com
XS40 XML Security Gateway  http://www.datapower.com/products/xs40.html
XML Security Overview  http://www.datapower.com/xmldev/xmlsecurity.html




Re: fyi: bear/enforcer open-source TCPA project

2003-09-11 Thread Peter Gutmann
Rich Salz <[EMAIL PROTECTED]> writes:

>Second, if the key's in hardware you *know* it's been stolen.  You don't know
>that for software.

Only for some definitions of "stolen".  A key held in a smart card that does
absolutely everything the untrusted PC it's connected to tells it to is only
marginally more secure than a key held in software on said PC, even though you
can only steal one of the two without physical access.  To put it another way,
a lot of the time you don't need to actually steal a key to cause damage: it
doesn't matter whether a fraudulent withdrawal is signed on my PC with a
stolen key or on your PC with a smart card controlled by a trojan horse; all
that matters is that the transaction is signed somewhere.
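
A toy sketch of that point in Python (the card interface is made up; using
the pyca/cryptography package): the key never leaves the "card", yet
whatever the compromised host submits comes back with a perfectly valid
signature.

# Sketch: the key stays inside the card, but the card signs whatever the
# (possibly trojaned) host hands it, because it has no display and no way
# to know what the user intended.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

class ToySmartCard:
    """Hypothetical card: holds the key, exposes only sign()."""
    def __init__(self):
        self._key = rsa.generate_private_key(public_exponent=65537,
                                             key_size=2048)
    def public_key(self):
        return self._key.public_key()
    def sign(self, transaction: bytes) -> bytes:
        return self._key.sign(transaction, padding.PKCS1v15(), hashes.SHA256())

card = ToySmartCard()

intended = b"pay bookstore $20"       # what the user thinks is being signed
fraudulent = b"pay attacker $2000"    # what the trojan actually submits
sig = card.sign(fraudulent)

# The bank's check succeeds: the key was never "stolen", but the damage
# is done anyway.
card.public_key().verify(sig, fraudulent, padding.PKCS1v15(), hashes.SHA256())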

Peter.



RE: fyi: bear/enforcer open-source TCPA project

2003-09-11 Thread Scott Guthery
There are roughly 1B GSM/3GPP/3GPP2 SIMs in daily use, and the number of
keys extracted from them is vanishingly small.



Re: fyi: bear/enforcer open-source TCPA project

2003-09-11 Thread Damian Gerow
Thus spake Rich Salz ([EMAIL PROTECTED]) [11/09/03 08:51]:
> > You propose to put a key into a physical device and give it
> > to the public, and expect that they will never recover
> > the key from it?  Seems unwise.
> 
> You think "the public" can crack FIPS devices?  This is mass-market, not
> govt-level attackers.

And 'the public' doesn't include people like government level attackers?
People like cryptography experts?  People who like to play with things like
this?

'The public' only includes the sheeple, and nobody else?



Re: fyi: bear/enforcer open-source TCPA project

2003-09-11 Thread Rich Salz
> And 'the public' doesn't include people like government level attackers?
> People like cryptography experts?  People who like to play with things like
> this?

No, it doesn't.  *It's not in the threat model.*

	/r$

--
Rich Salz, Chief Security Architect
DataPower Technology   http://www.datapower.com
XS40 XML Security Gateway   http://www.datapower.com/products/xs40.html
XML Security Overview  http://www.datapower.com/xmldev/xmlsecurity.html


is "secure" hardware worth it? (Was: Re: fyi: bear/enforcer open-source TCPA project)

2003-09-11 Thread Sean Smith

Just to clarify... 

I'm NOT saying that any particular piece of "secure" hardware can never be
broken.   Steve Weingart (the hw security guy for the 4758) used to insist that
there was no such thing as "tamper-proof."  On the HW level, all you can do is
talk about what defenses you tried, what attacks you anticipated, and what
tests you ran.

What I am saying is that using "secure coprocessors"---defined loosely, to
encompass this entire family of tokens---can be a useful tool.  Whether one
should use this tool depends on the context.  Are there
better alternatives that don't require the assumption of physical security?
How much flexibility and efficiency do you sacrifice if you go with one of
these alternatives?  How dedicated is the adversary?  What happens if a few
boxes get opened?  How much money do you want to pay for a device?

Some cases in point: it's not too hard to find folks who've chosen
a fairly weak point on the physical security/cost tradeoff, but still
somehow manage to make a profit.  

Of course, this all still leaves unaddressed the fun research questions of how to
build effective coprocessors, and how to design and build applications that
successfully exploit this security foundation.  (Which is some of what I've
been looking into over the last few years.)


--Sean

-- 
Sean W. Smith, Ph.D. [EMAIL PROTECTED]   
http://www.cs.dartmouth.edu/~sws/   (has ssl link to pgp key)
Department of Computer Science, Dartmouth College, Hanover NH USA



