Thor Lancelot Simon wrote:
So, you sign the public key the chip generated, and inject the _signed_
key back into the chip, then package and ship it. This is how the SDK
for IBM's crypto processors determines that it is talking to the genuine
IBM product. It is a good idea, and it also leaves the chip set up for
you with a preloaded master secret (its private key) for encrypting other
keys for reuse in insecure environments, which is really handy.
But do we really think that general-purpose CPUs or DSPs are going to
be packaged in the kind of enclosure IBM uses to protect the private keys
inside its cryptographic modules?
... long post warning :) ...
that is basically a certificate-based process .... i.e. a recognized
certification authority is signing the exported public key and injecting
it back into the chip ... as a form of digital certificate.
this allows any relying party ... that has a copy of the
appropriate certification authority's public key ... to validate the
device's digital certificate in an offline manner.
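the certificate-based flow above can be sketched in a few lines: a
certification authority signs the chip's exported public key, and a relying
party holding only the CA's public key validates the injected result offline.
this is a toy sketch with deliberately tiny RSA parameters and hypothetical
names ... not anyone's actual implementation.

```python
import hashlib

def next_prime(n):
    # smallest prime >= n, by trial division (fine at this toy scale)
    while any(n % p == 0 for p in range(2, int(n ** 0.5) + 1)):
        n += 1
    return n

def rsa_keypair(seed):
    # textbook RSA with tiny primes -- illustration only, not secure
    p, q = next_prime(seed), next_prime(seed + 1000)
    n, phi = p * q, (p - 1) * (q - 1)
    e = 65537
    while True:
        try:
            d = pow(e, -1, phi)
            break
        except ValueError:          # e not invertible mod phi; try another
            e = next_prime(e + 1)
    return (n, e), (n, d)

def sign(priv, msg):
    n, d = priv
    return pow(int.from_bytes(hashlib.sha256(msg).digest(), "big") % n, d, n)

def verify(pub, msg, sig):
    n, e = pub
    return pow(sig, e, n) == int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

ca_pub, ca_priv = rsa_keypair(1_000_003)      # certification authority's pair
chip_pub, chip_priv = rsa_keypair(2_000_003)  # generated inside the chip

# the CA signs the exported public key; the signed blob ("certificate")
# is injected back into the chip before packaging and shipping
cert_body = repr(chip_pub).encode()
cert_sig = sign(ca_priv, cert_body)

# offline relying party: needs only the CA public key, no online contact
assert verify(ca_pub, cert_body, cert_sig)
```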
the approach i described was not the offline pki-based scenario
but the certificateless flavor ... the "relying party" accepts the
public key and contacts the authoritative agency managing/hosting the
fab's manifest. the authoritative agency then returns whether it is an
original chip (rather than possibly a counterfeit / copy chip) and
possibly also the integrity characteristics of the particular chip.
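the certificateless flavor can be sketched the same way: nothing is injected
back into the chip; the relying party takes the public key the device
presents and asks the authoritative agency hosting the fab's manifest about
it. the dictionary fields and key names here are hypothetical stand-ins.

```python
# the fab's chip manifest, as hosted by the authoritative agency;
# keyed by the public key exported during initial power-on/test
fab_manifest = {
    "pubkey-A": {"original": True, "integrity": "EAL5-high"},
    "pubkey-B": {"original": True, "integrity": "EAL4"},
}

def manifest_lookup(public_key):
    """authoritative agency: report whether the chip is an original
    (vs. a possible counterfeit/copy) plus its integrity characteristics."""
    entry = fab_manifest.get(public_key)
    if entry is None:
        return {"original": False, "integrity": None}   # possible copy chip
    return entry

# relying party: accepts a public key from the device, checks it online
assert manifest_lookup("pubkey-A")["original"] is True
assert manifest_lookup("pubkey-X")["original"] is False
```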
in any case, can you say "parameterized risk management" :)
with respect to the "kind of enclosure IBM uses to protect the private
keys inside its cryptographic modules" ... the integrity
characteristics of any specific kind of chip are likely to be
proportional to the vulnerabilities, threats, risks and purposes that
the chip is used for. the high level of integrity for the ibm crypto
unit's private key isn't directly related to the cost of the unit and/or
whether it is a counterfeit unit ... it is much more related to the various
anticipated uses to which the ibm crypto unit will be applied.
say a 10-50 cent security chip that has been evaluated to EAL5-high
.... possibly even less ... see discussion here
http://www.garlic.com/~lynn/aadsm24.htm#28 DDA cards may address the UK
chip&Pin woes
... the integrity and protection of the private key is likely going to
be proportional to the purposes for which the chip will be used.
Part of the least expensive process ... is that other than the 20k-40k
additional circuits ... the actual processing, processing steps, and
processing time is done in such a way that there is absolutely no
difference from what they are doing today ... the initial power-on/test
to validate a working chip (before it has been sliced and diced from the
wafer) is the same exact step taking the same exact amount of time. the
exporting of the test fields indicating a valid working chip as part of
power-on/test ... is not changed ... other than there are a few more
bits that represent the exported public key. the storage and maintenance
in the fab chip manifest is exactly the same.
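the zero-incremental-cost claim can be pictured as follows: key-gen rides
inside the existing power-on/test step, the export record grows by the
public-key bits, and the manifest storage path is unchanged. all field and
function names below are hypothetical.

```python
def power_on_test(chip_id):
    # existing step: exercise the chip on-wafer, record the test fields
    record = {"chip": chip_id, "test_ok": True}
    # new: the additional 20k-40k circuit core generates a keypair on-die;
    # only the public half ever leaves, as a few more exported bits
    record["public_key"] = f"pubkey-{chip_id}"   # stand-in for real key-gen
    return record

# storage and maintenance in the fab chip manifest: exactly the same path,
# just indexed by public key instead of (or alongside) a serial number
fab_manifest = {}
for chip_id in ("0001", "0002"):
    record = power_on_test(chip_id)
    if record["test_ok"]:
        fab_manifest[record["public_key"]] = record

assert "pubkey-0001" in fab_manifest
```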
There is no incremental cost and no incremental processing ... other
than the chip real estate for additional 20k-40k circuits.
If you treat it as a real security chip (the kind that goes into
smartcards and hardware tokens) ... it eliminates the significant
post-fab security handling (prior to finished delivery), in part to
assure that counterfeit / copy chips haven't been introduced into the
stream .... with no increase in vulnerability and threat.
So finally it comes down to later wanting to check whether you have a
counterfeit / copy chip. The current scenario would be to read out the
static data serial number and have that looked up in the fab's chip
manifest. however serial number static data is vulnerable to things like
skimming and replay attacks. So in the basic operation ... for
effectively zero incremental cost ... you get dynamic data
serial number authentication for looking up in the fab's chip manifest
(as opposed to a simple static data serial number).
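the difference between the two serial-number styles can be made concrete: a
skimmed static serial replays verbatim, while a dynamic response is a
signature over a nonce the relying party just chose, so a captured response
is useless against the next challenge. again a toy RSA sketch with tiny
primes and hypothetical names.

```python
import hashlib
import secrets

def next_prime(n):
    # smallest prime >= n, by trial division (fine at this toy scale)
    while any(n % p == 0 for p in range(2, int(n ** 0.5) + 1)):
        n += 1
    return n

# toy on-chip keypair -- tiny primes, illustration only, not secure
p, q = next_prime(1_000_003), next_prime(1_001_003)
n, phi = p * q, (p - 1) * (q - 1)
e = 65537
while True:
    try:
        d = pow(e, -1, phi)
        break
    except ValueError:              # e not invertible mod phi; try another
        e = next_prime(e + 1)

def h(msg):
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

# static data serial number: whatever a skimmer captured replays perfectly
static_serial = b"chip-0001"
skimmed = static_serial
assert skimmed == static_serial               # replay attack succeeds

# dynamic data authentication: the chip signs a nonce chosen by the
# relying party; the manifest lookup supplies the matching public key
nonce1 = secrets.token_bytes(16)
response1 = pow(h(nonce1), d, n)              # computed inside the chip
assert pow(response1, e, n) == h(nonce1)      # fresh response verifies

nonce2 = secrets.token_bytes(16)
assert pow(response1, e, n) != h(nonce2)      # replayed response fails
```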
For nearly all uses of such a basic chip configuration, the cost of
attacking the private key (in such an eal5-high evaluated chip) is much
more than any likely benefit ... and is bracketed by being able to flag
the chip serial and public key in the fab's chip manifest.
As an attack purely for the purposes of selling 50 cent copy chips ...
each chip attack is going to cost enormously more than expected fraud
revenue.
So you have to be expecting something other than revenue from selling
copy chips .... to mount such an attack, you would have to be expecting
to be able to make use of the private key for some significantly larger
benefit than selling a copy chip.
If you are talking about an attack on the private key ... for purposes
other than selling a copy chip ... then you are into security
proportional to risk ... i.e. having a variety of chips with integrity
proportional to risks of their expected use ... some expected uses far
above an EAL5-high evaluation ... maybe an EAL10 :) or EAL25 :) evaluation?
So for extremely close to zero cost ... you can add private key
and digital signature capability to any chip as a countermeasure to
counterfeit and copy chips. As a side-effect ... it may be possible to also
use the digital signing capability of the embedded circuits to represent
"something you have" authentication. However, the utilization of any such
side-effect should be evaluated from the standpoint of the integrity of the
chip's private key environment and whether it is proportional to the risks
associated with the expected application uses.
now when i was talking about this with some government types ... within
the context of parameterized risk management ... i.e. the integrity of
the chip and the associated private key integrity could be dynamically
evaluated to see whether it satisfied the requirements for the
purposes it would be applied to ... they commented that this area was
totally missed in the work on x.509v3 digital certificates. they
commented that if i were to develop an integrity level grading system
(for a real-time, online parameterized risk management operation being
able to dynamically take into account chip integrity ... including that
the chip integrity may have degraded since it had been originally
manufactured ... i.e. advances/changes in attack technology/knowledge
may have increased the chip vulnerability and lowered its integrity) ...
then they would see that x.509v3 digital certificates were extended to
allow specifying a static flavor of chip integrity level.
the basic process was that private key, digital signatures and public
key could be added to existing chips at absolutely ZERO additional cost
(other than the 20k-40k additional circuits) as a countermeasure to copy
chips (where the existing mechanism involves lookup using a static serial
number). additional uses of such a private key and digital signature
capability have to be evaluated by weighing the basic integrity level of
the chip (& private key) against the risks associated with the target uses.
Some simple armoring of the private key comes with the design of the
basic 20k-40k additional core (i.e. in many respects, the additional
circuits operate as a separate computer core and nothing is directly
available to the primary processor). That level of integrity may, in
fact, be sufficient for a large number of applications.
so ... instead of having a lookup parameterized risk management system
(as originally described) ... the integrity level stuff might indeed be
retrofitted to stale, static x.509v3 certificates. in the ibm scenario
... the crypto unit would have an evaluated integrity level ... the
public key is exported ... some sort of digital certificate is created
with the crypto unit's public key and the crypto unit's integrity level
... and the result is digitally signed ... by some sort of certification
authority .... and the certificate is injected back into the crypto
unit. future users of the crypto unit can not only extract the digital
certificate to validate it is an "original" crypto unit (as opposed to
possibly a counterfeit or copy unit) ... but also obtain the integrity of
the crypto unit at the time it was manufactured (for a moment ignoring
that the integrity level of the crypto unit may degrade over time as
technology advances) ... and evaluate whether the certified integrity
level is sufficient for the uses for which it will be applied.
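a relying party's check against such an extended certificate reduces to two
questions: is the unit original, and is the certified (as-manufactured)
integrity level sufficient for the intended use. the level names and their
ordering below are hypothetical placeholders.

```python
# hypothetical ordering of evaluation levels, weakest to strongest
LEVELS = ["EAL1", "EAL2", "EAL3", "EAL4", "EAL5", "EAL5-high", "EAL6", "EAL7"]

def sufficient(cert_level, required_level):
    """is the level certified at manufacture at least the level the
    intended application requires? (ignores degradation over time ...
    exactly the weakness of a stale, static certificate)"""
    return LEVELS.index(cert_level) >= LEVELS.index(required_level)

# digital certificate contents: public key plus integrity level, CA-signed
certificate = {"pubkey": "pubkey-A", "integrity": "EAL5-high"}

assert sufficient(certificate["integrity"], "EAL4")      # ok for low-risk use
assert not sufficient(certificate["integrity"], "EAL6")  # not for higher-risk use
```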
in the lookup parameterized risk management ... there is absolutely
no change in current day standard fab chip processing ... the whole
thing is submerged into processes that already occur. I think i was
quoted something like a couple pennies per chip per second of additional
processing. For the fundamental process, I had incentive ... to
incorporate the key-gen and public key export into the existing fab chip
processing so there was absolutely no increase in elapsed time of
initial chip power-on/test.
NOTE, there is a basic premise here that parameterized risk management
doesn't require that there can be one and only one integrity level that
has to be met by all devices for all purposes .... it can assume that
the required integrity level need only be sufficient for the purposes for
which it will be applied. If it is only going to be used to raise the
barrier for copy chip vulnerabilities for chips that are priced at tens
of cents to tens of dollars ... you might choose one level of private
key armoring. The level of private key armoring might be increased if
you start talking about copy chip countermeasures for chips that cost
hundreds or thousands of dollars.
The risk/threat landscape can also be considerably different if you are
doing dynamic, online, real-time lookups or if you are depending on a stale,
static, offline digital certificate environment.
Another dynamic might be if such a design was incorporated into a
variation of RFID chips where the RFID chip is then incorporated into a
pill bottle worth hundreds of dollars and targeted as a countermeasure to
counterfeit/copy drug vulnerability (i.e. one of the issues in the
original 40k circuit design from the late 90s was the extremely low power
requirements for working in contactless, radio frequency deployments)
as an aside, the patents referenced in the original post (with which we
no longer have any relationship)
http://www.garlic.com/~lynn/aadsm24.htm#49 Crypto to defend chip IP:
snake oil or good idea?
allowed for both digital certificate modes of operation and
certificateless operation.
some recent posts mentioning contactless/proximity and/or power/rf
design considerations in the original aads chip strawman:
http://www.garlic.com/~lynn/aadsm23.htm#56 UK Detects Chip-And-PIN
Security Flaw
http://www.garlic.com/~lynn/aadsm24.htm#1 UK Detects Chip-And-PIN
Security Flaw
http://www.garlic.com/~lynn/aadsm24.htm#2 UK Banks Expected To Move To
DDA EMV Cards
http://www.garlic.com/~lynn/aadsm24.htm#5 New ISO standard aims to
ensure the security of financial transactions on the Internet
http://www.garlic.com/~lynn/aadsm24.htm#7 Naked Payments IV - let's all
go naked
http://www.garlic.com/~lynn/aadsm24.htm#8 Microsoft - will they bungle
the security game?
http://www.garlic.com/~lynn/aadsm24.htm#27 DDA cards may address the UK
Chip&Pin woes
http://www.garlic.com/~lynn/aadsm24.htm#28 DDA cards may address the UK
Chip&Pin woes
http://www.garlic.com/~lynn/aadsm24.htm#30 DDA cards may address the UK
Chip&Pin woes
misc. past posts mentioning parameterized risk management
http://www.garlic.com/~lynn/aadsmore.htm#bioinfo2 QC Bio-info leak?
http://www.garlic.com/~lynn/aadsmore.htm#bioinfo3 QC Bio-info leak?
http://www.garlic.com/~lynn/aadsmore.htm#biosigs biometrics and
electronic signatures
http://www.garlic.com/~lynn/aadsm2.htm#strawm3 AADS Strawman
http://www.garlic.com/~lynn/aepay3.htm#x959risk1 Risk Management in AA /
draft X9.59
http://www.garlic.com/~lynn/aepay6.htm#x959b X9.59 Electronic Payment
standard issue
http://www.garlic.com/~lynn/aadsm3.htm#cstech3 cardtech/securetech & CA PKI
http://www.garlic.com/~lynn/aadsm3.htm#cstech4 cardtech/securetech & CA PKI
http://www.garlic.com/~lynn/aadsm3.htm#cstech5 cardtech/securetech & CA PKI
http://www.garlic.com/~lynn/aadsm3.htm#cstech9 cardtech/securetech & CA PKI
http://www.garlic.com/~lynn/aadsm3.htm#cstech10 cardtech/securetech & CA PKI
http://www.garlic.com/~lynn/aadsm3.htm#kiss2 Common misconceptions, was
Re: KISS for PKIX. (Was: RE: ASN.1 vs XML (used to be RE: I-D ACTION
:draft-ietf-pkix-scvp-00.txt))
http://www.garlic.com/~lynn/aadsm12.htm#17 Overcoming the potential
downside of TCPA
http://www.garlic.com/~lynn/aadsm19.htm#15 Loss Expectancy in NPV
calculations
http://www.garlic.com/~lynn/aadsm19.htm#44 massive data theft at
MasterCard processor
http://www.garlic.com/~lynn/aadsm19.htm#46 the limits of crypto and
authentication
http://www.garlic.com/~lynn/aadsm2.htm#stall EU digital signature
initiative stalled
http://www.garlic.com/~lynn/aadsm21.htm#5 Is there any future for
smartcards?
http://www.garlic.com/~lynn/aadsm21.htm#8 simple (&secure??) PW-based
web login (was Re: Another entry in the internet security hall of shame....)
http://www.garlic.com/~lynn/aadsm23.htm#1 RSA Adaptive Authentication
http://www.garlic.com/~lynn/aadsm23.htm#27 Chip-and-Pin terminals were
replaced by "repairworkers"?
http://www.garlic.com/~lynn/99.html#235 Attacks on a PKI
http://www.garlic.com/~lynn/99.html#238 Attacks on a PKI
http://www.garlic.com/~lynn/2000.html#46 question about PKI...
http://www.garlic.com/~lynn/2000.html#57 RealNames hacked. Firewall issues.
http://www.garlic.com/~lynn/2001.html#73 how old are you guys
http://www.garlic.com/~lynn/2003j.html#33 A Dark Day
http://www.garlic.com/~lynn/2003p.html#26 Sun researchers: Computers do
bad math ;)
http://www.garlic.com/~lynn/2004h.html#38
build-robots-which-can-automate-testing dept
http://www.garlic.com/~lynn/2005k.html#23 More on garbage
http://www.garlic.com/~lynn/2006g.html#40 Why are smart cards so dumb?
---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]