UPnP Security specs available for review

2003-08-22 Thread Carl Ellison
http://www.upnp.org/draftspecs/

Enjoy,

Carl


+-------------------------------------------------------+
|Carl Ellison        Intel R & D    E: [EMAIL PROTECTED]|
|2111 NE 25th Ave                   T: +1-503-264-2900  |
|Hillsboro OR 97124                 F: +1-503-264-3375  |
|PGP Key ID: 0xFE5AF240                                 |
|  1FDB 2770 08D7 8540 E157  AAB4 CC6A 0466 FE5A F240   |
+-------------------------------------------------------+

-
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]


Re: UPnP Security specs available for review

2003-08-25 Thread John Gilmore
Carl,

What's the design lifetime of this security system?

1024-bit RSA is too short.  If you're going to all the trouble to
build a supposedly secure system, use a length that won't be broken.
My suggestion these days is significantly north of 2048 bits.  Don't
use a power of two, and, ideally, use key lengths that vary among
devices, so that there's no "sweet spot" for someone to build a
key-cracking machine for.

E.g. One device that implements your spec might have a key length of
2432 bits; the next one a length of 2200 bits; a third 2648 bits.
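John's varying-key-length suggestion can be sketched in a few lines. This is a hedged illustration only; the function name and the 2048-3072-bit range are my assumptions, not anything from the spec:

```python
import secrets

def pick_rsa_bits(lo=2048, hi=3072, step=8):
    """Choose an RSA modulus length at random from [lo, hi] in multiples
    of `step`, rejecting exact powers of two so that no single key size
    becomes a "sweet spot" for a dedicated cracking machine.
    (Illustrative helper; the range and step are assumptions.)"""
    while True:
        bits = lo + step * secrets.randbelow((hi - lo) // step + 1)
        if bits & (bits - 1) != 0:   # skip powers of two (2048, 4096, ...)
            return bits
```

Each manufactured device would call something like this once at key-generation time, so the installed base ends up with a spread of moduli rather than one uniform cracking target.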

It's clear that the crypto implemented in these devices in the near
future is going to be in iterative software, rather than wide
hardware, so there's no reason to limit the keys to 1024 bits except for
performance and the tiny cost of memory space.  And we all know that:

   the number of public-key operations required is low
   the latency of public-key operations is usually negligible at system level
   the performance of hardware always increases rapidly
   available memory always increases rapidly

So don't fall into the NSA trap that's plagued cellphones and every
other consumer device.  Don't build a security system for every
household device that is "secure ENOUGH" but has a weak link designed
in.

And don't forget that Intel wants to sell that increased-performance
hardware too.  So the crypto system should be right at the "sluggishly
slow" point when first released.  Because two or three years out, new
devices will be "plenty fast" and then a few years later any crypto
overhead will be "invisible" in new devices.

Also, once you have established session keys between two devices, why
would you EVER send plaintext between them (page 6, paragraph 9)?  The
spec should say that plaintext messages will not be accepted, and the
implementations should definitely ignore any that arrive.  This is the
failure mode of US cellphones: the TDMA and CDMA standards define a
(poor) encryption scheme, but even though it's in every phone, the
cellphone service vendors have all been pressured to disable it on
every call.  (In fact it isn't even built into the base stations.)  IF
IT'S POSSIBLE TO DISABLE THE CRYPTO, THE GOVERNMENT WILL CAUSE IT TO
HAPPEN IN PRACTICE.  So spec it so each device *must* have the secure
crypto, on every message, or it won't interoperate.

Also, what's this business about manufacturers generating the
long-term keys and putting them in the devices and not letting users
change them (pg 6, first sentence)?  Have you gone over to the Dark
Side?

How many seconds would it take for a rogue Security Console to try all
possible 6-uppercase-letters passwords after you plug in a device
(e.g. to charge) and before you try to control it from your own
Security Console?  2 milliseconds per try (500 tries a second) seems
like a high estimate, for an operation that only has to check a hash,
especially ten years from now.  I didn't work out the message lengths,
but on a 10 mbit ethernet, at wire speed, wouldn't the attacker be
able to try passwords faster than this today?  And what about on the
10 gbit ethernet that'll be default in ten years, with the hash done
in the invisibly small 40 GHz embedded processor that'll be default in
ten years?
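For scale, the brute-force arithmetic John is gesturing at works out like this (back-of-the-envelope only):

```python
# The 6-uppercase-letter password space, exhausted at John's "high
# estimate" of 500 tries per second (2 ms per guess).
space = 26 ** 6                      # 308,915,776 candidate passwords
tries_per_sec = 500
seconds_to_exhaust = space / tries_per_sec
days = seconds_to_exhaust / 86400    # ~7.2 days worst case, half on average
print(space, round(days, 2))
```

At wire speed on even a 10 Mbit network, the attacker's try rate would be limited by message size rather than hashing, so the real figure could be far lower than a week.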

John

PS: It's nice that on page 7 you tell manufacturers in paragraph 14
how to build back doors into devices.  As is obvious in current
telecomm systems (including "anonymity" software), if these are
buildable, then the government will pass a law mandating their use to
subvert the user's control over their own life and privacy.  E.g. this
is where you'd put the "Execute DRM" interface, where Senator Hatch
will send a message that destroys the device if hollywood thinks you
aren't a subservient godfearin' amurrican.  And the CALEA interface.

PPS: I stopped at page 7; these comments were getting long and I was
losing interest.  Carl, you know better than to design in each of
these flaws, so presumably if you'd had the power to fix them, I
wouldn't have to send in these comments.  Thus, none of them are going
to get fixed.  Right?  Why did you bother publishing this spec?  Might
as well have all the mfrs just agree on it in secret and ram it down
our throats.




Re: UPnP Security specs available for review

2003-08-26 Thread Carl Ellison
Hi John.

I'm sorry you were disappointed.  I appreciate your comments on the
overview and summary, though.

1024 bits is not an upper limit on key size, but a lower limit.  I
appreciate your suggestion of varying key lengths and am glad that
you have put it in the open literature (this mail list).  We
explicitly plan for letting people add algorithms or key lengths of
their choice, should they find the defaults unacceptable.

As with any other standard, this is not an individual effort but a
group activity in standards committee.  A number of our members have
or plan to have small devices with limited processing power.  There
was significant resistance to the expense of public key operations,
which is why we restricted it to two actions: TakeOwnership and
SetSessionKeys, both rare, and set the default key size at 1024 bits.

The reason we would ever send plaintext is that encryption is seen as
expensive for these limited devices.  It is additionally expensive in
this design because we could not violate the pre-existing device
architecture and so were limited to a tunneling method for encryption.
That turns out to have been an advantage for a variety of reasons, but
it also increased the expense.  However, any manufacturer is free to
encrypt all traffic, if it chooses.  That's all up to the
manufacturer.

On pp. 24-25, we give advice to manufacturers on how to choose a
TakeOwnership password length.  The actual choice is up to the
manufacturer, of course.

I'm glad you enjoyed point #14.  That was a design requirement from
the beginning of the project.  That is, in some devices (e.g., a
remotely accessible power meter), it was envisioned that there will
be functions that the manufacturer reserves to itself.  Rather than
have the manufacturer muck with the normal device's access control
structure and make its behavior strange, we strongly suggest that the
manufacturer create a sub-device to carry its own reserved functions
and leave that sub-device owned by the manufacturer from the
beginning.
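In outline, the arrangement Carl describes might look like this (field names are purely illustrative; the spec defines no such schema):

```python
# Illustrative sketch: a device exposing a manufacturer-owned sub-device
# for reserved functions, so the main device's access-control structure
# stays entirely under the household owner's control.
device = {
    "owner": "household_security_console",
    "services": ["SwitchPower"],
    "sub_devices": [
        {
            "owner": "manufacturer",          # never transferred at TakeOwnership
            "services": ["RemoteMeterRead"],  # functions reserved to the maker
        }
    ],
}
```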

This mail thread might serve as advice to manufacturers.  The
suggestions you have made do not contradict the spec, but should be a
valuable addition to it.  Thanks again.

 - Carl


Re: UPnP Security specs available for review

2003-08-26 Thread Carl Ellison
At 12:38 AM 8/23/2003 -0700, John Gilmore wrote:
>Carl,
>
>What's the design lifetime of this security system?

The original lifetime plan was 2-3 years before being replaced by UPnP V2, plus some 
number of years for eventual replacement of older equipment in the field.  We don't 
have a firm release date for V2, however, so I can't be sure of that lifetime.

V2 is expected to be a profile of Web Services Security.

 - Carl


