And why should we trust hardware implementations, which are even more opaque
to analysis than binary-only software?

Enzo


----- Original Message -----
From: "Eugene Leitl" <[EMAIL PROTECTED]>
To: "Rick Smith" <[EMAIL PROTECTED]>
Cc: "Arnold G. Reinhold" <[EMAIL PROTECTED]>; "John Gilmore"
<[EMAIL PROTECTED]>; <[EMAIL PROTECTED]>; <[EMAIL PROTECTED]>
Sent: Thursday, May 25, 2000 7:41
Subject: Re: NSA back doors in encryption products


> Rick Smith writes:
>
>  > 3) A more sophisticated backdoor in Windows would involve a lot of
>  > people who can't be covered by government secrecy agreements. It would
>  > be extremely difficult to keep such a thing both functioning and secret
>  > for more than a few years.
>
> The canonical way to install an essentially undetectable trapdoor is
> to use multiple remotely exploitable buffer overruns. In the absence
> of source code, this is essentially impossible to detect unless
> triggered by chance (which can be made arbitrarily improbable by
> design, e.g. through cooperation between a mailer and the OS), and the
> deliberate introduction of such a flaw can always claim plausible
> deniability. Cryptography on a box running keyboard capture is pretty
> useless, of course.
>
> If NSA/MS are not doing it, they must be pretty stupid, because I'd do
> it in their place. The prudent assumption is hence: your online system
> can't be completely trusted, whether open source or not. Encryption
> should be done in hardware.
>
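
To make the quoted point concrete: a deniable trapdoor of this kind need
be nothing more than an ordinary-looking coding mistake. Below is a
minimal sketch in C of an unchecked copy that reads like an innocent bug
but is remotely exploitable; the function and constant names
(parse_subject, MAX_SUBJECT) are invented purely for illustration, not
taken from any real product.

    /* Sketch only: a hypothetical mail-header parser showing how a
     * "deniable" remotely exploitable overrun can look like an
     * ordinary bug. All names here are invented for illustration. */
    #include <string.h>

    #define MAX_SUBJECT 128

    void parse_subject(const char *line)
    {
        char subject[MAX_SUBJECT];

        /* Looks like a routine copy, but the length of 'line' is
         * attacker-controlled and never checked, so an overlong
         * Subject: header overruns 'subject' on the stack and can
         * overwrite the return address. */
        strcpy(subject, line);

        /* ... further processing of 'subject' ... */
    }

An auditor working from the binary alone would have no realistic way to
tell this apart from an accident, which is exactly the plausible
deniability described above.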

