> If crypto is performed by hardware, how sure can users/designers be
> that it is truly secure (since one can't examine the code)?

I'm currently microprogramming the 2800, and have worked on a crypto
ASIC in Verilog.
Some comments, food for thought:

You *can* examine the code if the manufacturer understands the needs of
security folks.  (This ranges from inspectable design stages (specs ->
RTL -> GDSII) to taking your raw RNG output to a pin for diagnostics,
which Intel forgot to do when they made an RNG.)  Inspectable design !=
free source code.
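
If you do get raw RNG output on a pin, even a crude offline check will
catch gross failures.  A minimal sketch in Python, assuming a
hypothetical capture file of raw bytes; a real evaluation would use a
proper battery (Diehard, the NIST tests), not this:

    # Crude sanity check on raw RNG bytes captured from a diagnostic pin.
    # "rng_capture.bin" is a hypothetical dump file, and this is no
    # substitute for a real test suite.
    import sys
    from collections import Counter

    def monobit_fraction(data):
        """Fraction of 1 bits; should sit near 0.5 for a healthy raw source."""
        ones = sum(bin(b).count("1") for b in data)
        return ones / (8 * len(data))

    def byte_chi_square(data):
        """Chi-square of byte frequencies against uniform (255 degrees of freedom)."""
        counts = Counter(data)
        expected = len(data) / 256.0
        return sum((counts.get(v, 0) - expected) ** 2 / expected for v in range(256))

    if __name__ == "__main__":
        path = sys.argv[1] if len(sys.argv) > 1 else "rng_capture.bin"
        raw = open(path, "rb").read()
        print("fraction of 1 bits:", monobit_fraction(raw))
        print("byte chi-square (roughly 293 is suspicious at 5%):", byte_chi_square(raw))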

Buy your chips anonymously from Frys, and test some, sampling randomly.

Testing may mean taking them to ChipWorks and getting them stripped,
reverse engineered up to the functional level.  If you have access to
the RTL, you can run comparisons.  Look for those extra gates the layout
guy secretly added in return for citizenship for his family.
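
A first pass at that comparison doesn't need the full formal flow.
Here is a toy sketch, assuming a made-up one-gate-per-line netlist
format and made-up filenames, that just compares gate-type counts
between the netlist synthesized from the RTL you were handed and the
one recovered from the stripped die; extra gates show up in the census
even when net names don't match.  Real flows use formal equivalence
checkers, of course.

    # Toy gate-type census diff between a golden RTL-derived netlist and a
    # netlist reconstructed from the stripped die.  The "<instance> <type>
    # <pins...>" text format and both filenames are hypothetical.
    from collections import Counter

    def gate_census(path):
        census = Counter()
        with open(path) as f:
            for line in f:
                fields = line.split()
                if len(fields) >= 2:
                    census[fields[1]] += 1    # count by gate type, ignore names
        return census

    golden    = gate_census("golden_from_rtl.net")
    recovered = gate_census("chipworks_recovered.net")

    for gate_type in sorted(set(golden) | set(recovered)):
        if golden[gate_type] != recovered[gate_type]:
            print(gate_type, "RTL:", golden[gate_type], "die:", recovered[gate_type])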

You might also use "blank" reconfigurable logic devices (Xilinx, Altera)
and program them yourself.  However, the programming is more mutable in
the field than silicon gates are.  But you don't pay a million bucks for
a mask set.
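
That field mutability is manageable if you at least pin the image you
load to a digest you computed when you audited it.  A minimal sketch,
with a hypothetical bitstream filename and a placeholder digest
(readback-and-compare from the part itself is vendor-tool territory, so
it isn't shown):

    # Verify an FPGA bitstream against a digest recorded at audit time.
    # The filename and the known-good value are placeholders.
    import hashlib

    KNOWN_GOOD_SHA256 = "replace-with-the-digest-of-the-audited-image"

    def bitstream_digest(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    digest = bitstream_digest("crypto_core.bit")
    if digest != KNOWN_GOOD_SHA256:
        raise SystemExit("bitstream does not match audited image: " + digest)
    print("bitstream matches audited image")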

Beware JTAG, BTW, in Crypto.

> Is there any way to determine whether standard forms of encryption
> have been monkeyed with in some way (ie, to make those with certain
> backdoor keys have access at will, and yet still conform to the
> standard as far as users can see)?
> And, are hardware-based encryption implementations considered suspect
> from the standard by the more "careful" parts of the crypto community?

Reverse engineering is the operational solution.  So is having vetted
design/fab folks, e.g., the NSA.  And marines with dogs guarding various
things.

Hardware is fine with crypto folks.  There is a tradeoff of mutability
vs. observability compared to software.  Hardware is fine for embedded
devices, and a lot of professional military security is in embedded
devices.  TV decoder PODs too; smartcards, I suppose, for euros.

Hardware crypto is just another point in design space; it can save power
because it's more efficient than a general cpu, but it costs board space
& complexity.  It can achieve higher throughput, because it's hardware,
if you need it.
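
A back-of-envelope version of the throughput point, with every number an
illustrative assumption rather than a measurement:

    # Illustrative-only arithmetic: an assumed 1 GHz CPU running a cipher
    # at 20 cycles/byte vs. an assumed 200 MHz pipelined core doing one
    # 128-bit block per cycle.
    cpu_clock_hz    = 1.0e9
    cycles_per_byte = 20.0
    core_clock_hz   = 200.0e6
    bytes_per_cycle = 16.0

    sw_bytes_per_s = cpu_clock_hz / cycles_per_byte
    hw_bytes_per_s = core_clock_hz * bytes_per_cycle

    print("software: %.0f MB/s" % (sw_bytes_per_s / 1e6))   # ~50 MB/s
    print("hardware: %.0f MB/s" % (hw_bytes_per_s / 1e6))   # ~3200 MB/s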

One can also argue that if you have to worry about cryptochips getting
swapped in their sockets at night, you have many other (easier for the
adversary) concerns already.  Got Scarfo?  I.e., checked your cables
recently?

However, as some very skilled reverse engineers have shown, you have to
be careful about who physically has the hardware.  The *bank's* secrets
in *hacker* wallets are bad.  *Your* key in *your* wallet is good.
Do not lend Paul Kocher your hotel keycard :-)
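
For a feel of why possession matters, here is a toy sketch of the
timing-leak idea (not Kocher's actual attacks, which need statistics
over many traces): a naive early-exit compare runs longer the more
leading bytes of a guess are right.  The secret and the guesses are
made up.

    # Toy timing-leak demo: early-exit comparison vs. constant-time compare.
    # SECRET and the guesses are stand-ins; real attacks average many traces.
    import hmac
    import time

    SECRET = b"0123456789abcdef"

    def naive_equal(a, b):
        if len(a) != len(b):
            return False
        for x, y in zip(a, b):
            if x != y:          # early exit leaks the matching prefix length
                return False
        return True

    def time_guess(guess, trials=50000):
        start = time.perf_counter()
        for _ in range(trials):
            naive_equal(SECRET, guess)
        return time.perf_counter() - start

    print("0 bytes right:", time_guess(b"zzzzzzzzzzzzzzzz"))
    print("8 bytes right:", time_guess(b"01234567zzzzzzzz"))
    print("constant-time:", hmac.compare_digest(SECRET, b"01234567zzzzzzzz"))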

In summary, there is no perfect solution (hey, Synopsys could recognize
an S-box and add something special, like Thompson's evil Turing Award
compiler...), but there is a niche for hardware.  Since there is more
space on the die, and since you compete on extra features, expect to see
crypto support pop up in low-end and high-end chips with network
applications.  Crypto functionality (or Ethernet interfaces, for that
matter) is to commodity cpu makers what cup-holders and seat-warmers are
to car makers.

Why bother with hardware subversion when there is CALEA?  Anyone (telco,
VoIP) buying the systems (Cisco) made with crypto chips (Intel) will
have to buy systems programmed to tap.  If you can buy the heads of
cointel at the FBI and CIA for under $2e6 each, do you really need to
black-bag a fab?
