Robert Jueneman wrote:
[snip]

> To solve this problem, I proposed the notion of "strong but brittle
> cryptography" - cryptography that would be designed to break very
> easily, at a defined time in the future, but be very robust until that
> time.
> 
> This would involve say 100 institutions very interested in preserving
> historical records (museums, universities, religious institutions,
> newspapers, government archivists, etc.) coming together and
> generating
> a set of very strong ECC key pairs, using K out of N secret sharing
> techniques, where the K shares needed to recover the key might be say
> 70.
> 
> Those keys would be embedded in certificates with validity periods of
> 10, 20, 30, ... 100 years.
> 
> Every ten years, the institutions would come together and reconstitute
> the key that was due to expire on that particular date, and then
> publish
> it to the world.
> 
> Records intended to be kept secret until that date would be very
> secure,
> but after that date would be easily readable by anyone, with no
> cryptanalysis required.

A most excellent idea. This way we might find out the "truth" 
about Coventry, Pearl Harbor, the JFK assassination, 9/11, etc., 
one day.

(Warning, bad pun ahead.)

The key problem is that it is a minority-controlled release. 
Using your 70 of 100 metric, 31 could potentially deny release by 
refusing to participate, if I understand how your idea would work 
in practice. The other, lesser problem is being sure that 
enough of the conveners attend every ten years to make the 
changes needed. Given that not every institution survives as long 
as the University of Bologna, it could become an issue. I think 
that these could be resolved by a variety of reverse tontine rules.

---
RRJ:  Allen, the precise values of K and N could certainly be jiggered.
I arbitrarily selected 100 as the value for N,
but whether it would be possible to convince 100 institutions to
participate is not clear, especially since the payoff probably wouldn't
occur in their lifetimes.
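
Purely for illustration, here is what a 70-out-of-100 split might look
like using Shamir's scheme over a prime field.  This is a toy sketch
with arbitrary parameters (and the field size is just the Mersenne prime
2^521 - 1); a real deployment would want a vetted, verifiable
secret-sharing implementation rather than code like this:

    import secrets

    P = 2**521 - 1    # a Mersenne prime, comfortably larger than any key being split

    def split(secret, n=100, k=70):
        # Random degree-(k-1) polynomial with the secret as the constant term.
        coeffs = [secret] + [secrets.randbelow(P) for _ in range(k - 1)]
        def f(x):
            return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
        return [(x, f(x)) for x in range(1, n + 1)]

    def recover(shares):
        # Lagrange interpolation at x = 0 over the prime field.
        total = 0
        for i, (xi, yi) in enumerate(shares):
            num, den = 1, 1
            for j, (xj, _) in enumerate(shares):
                if i != j:
                    num = num * (-xj) % P
                    den = den * (xi - xj) % P
            total = (total + yi * num * pow(den, -1, P)) % P
        return total

    key = secrets.randbelow(P)           # stand-in for the private key to be released later
    shares = split(key)                  # one share per institution
    assert recover(shares[:70]) == key   # any 70 institutions can reconstitute the key
    assert recover(shares[-70:]) == key  # ...whichever 70 they happen to be
    # 69 shares reveal nothing, so 31 refusals (100 - 70 + 1) are enough to block release.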

The problem of the conveners not surviving that long should not be that
severe. 

Several German beer companies and perfume manufacturers have been around
for well over 500 years; the University of Bologna, Oxford, the
Sorbonne, Harvard, and others have been around for 300 or more years;
and the Vatican, the Church of England, the LDS Church, and other
religious institutions are likely to remain in existence for the next
millennium. Banking institutions such as the Bank of England, the
Federal Reserve Board, and some of the Swiss banks; various chartered
foundations, quasi-governmental bodies, and fraternal organizations,
including the Smithsonian Institution, the National Geographic Society,
the British Museum, the Library of Congress, the International Red Cross
and Red Crescent organizations, and the Ancient Free and Accepted
Masons; and of course agencies such as the National Archives and the
Presidential Libraries will presumably be around for centuries.

Conversely, by being reasonably selective in choosing the members, it
seems unlikely that a minority cabal would vote to deny the release of
historical records, and even if they did, that decision could be
reversed some time in the future (assuming the key shares were not
deliberately destroyed in the meantime).

--
The other issue, which is true of all cryptographic functions, is 
unanticipated vectors of attack. As with the conveners issue, I 
suspect a carefully thought out, multi-level encryption structure 
could protect against this, but the real problem is that one would 
need to make sure that *all* the old data that still needs to be 
protected is migrated to the next algorithm. If even one copy 
exists that has not been migrated, then you might as well not have 
bothered to migrate any data at all when the algorithm is 
cracked. The consequence of this is that data would be released 
before its end-of-secrecy date. This might or might not be a big 
deal depending on the nature of the data. For example, 9/11 data 
that was released at 40 years rather than 50 would probably not 
be all that big a deal, but DNA records released too soon might be 
a big deal if insurance companies are allowed to discriminate 
based on them.
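
One way to read "multi-level encryption structure" is a simple cascade
of two unrelated ciphers, so that an unanticipated break of either one
alone gains nothing.  A toy sketch, assuming the pyca/cryptography
package and omitting all key management:

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305

    def cascade_encrypt(record):
        # Two unrelated ciphers in series: both must fail before the record is exposed.
        k1, k2 = os.urandom(32), os.urandom(32)
        n1, n2 = os.urandom(12), os.urandom(12)
        inner = AESGCM(k1).encrypt(n1, record, None)
        outer = ChaCha20Poly1305(k2).encrypt(n2, inner, None)
        return (k1, n1, k2, n2), outer

    def cascade_decrypt(keys, ciphertext):
        k1, n1, k2, n2 = keys
        inner = ChaCha20Poly1305(k2).decrypt(n2, ciphertext, None)
        return AESGCM(k1).decrypt(n1, inner, None)

    keys, ct = cascade_encrypt(b"record sealed until 2107")
    assert cascade_decrypt(keys, ct) == b"record sealed until 2107"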

RRJ:  This problem is probably impossible to solve completely.  Although
some non-cryptographic privacy schemes involving splitting the data into
shares have been proposed, there is a huge storage overhead associated
with such schemes.  The National Archives projects that the amount of
data it stores will triple every five years, and that by mid-century it
will be looking at 300,000 terabytes.  Going back and
re-encrypting that much data is simply never, ever going to happen.
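
Back of the envelope, assuming (generously) 10 GB/s of sustained,
dedicated re-encryption throughput - an assumed figure, not anyone's
actual projection:

    total_bytes = 300_000 * 10**12      # 300,000 terabytes projected by mid-century
    throughput  = 10 * 10**9            # assumed 10 GB/s of sustained, dedicated I/O
    years = total_bytes / throughput / (3600 * 24 * 365)
    print(round(years, 1))              # roughly a year of nonstop re-encryption,
                                        # before even locating every stray copy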

The good news is that even if new attack vectors eventually become
feasible, it isn't likely that they will become practical.  For example,
when I first got involved in cryptography, single DES and RSA-384 were
considered commercially viable.  Today, 30+ years later, those standards
would be considered laughable, because either one would be breakable
with, say, a day's worth of time on a computer costing around a hundred
thousand dollars.  However, in the case of a file encryption system,
breaking a DES file encryption key costs you a day's worth of time, and
decrypts only a single file, e-mail message, or whatever.  That is
hardly a very attractive proposition, unless you have a file that is
still exceptionally valuable, thirty years after the fact.
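
To make the per-file economics concrete: each record carries its own
independently generated data key, so a successful brute-force attack
buys exactly one record.  A sketch only (again pyca/cryptography, with
the wrapping of each data key under the long-term "brittle" public key
left out):

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def seal(record):
        dek = os.urandom(32)      # fresh data encryption key for this one record
        nonce = os.urandom(12)
        ct = AESGCM(dek).encrypt(nonce, record, None)
        # In the archival scheme the dek would then be wrapped under the long-term
        # public key (not shown), rather than kept alongside the data.
        return dek, nonce, ct

    dek, nonce, ct = seal(b"one e-mail message")
    assert AESGCM(dek).decrypt(nonce, ct, None) == b"one e-mail message"
    # Brute-forcing this one key exposes this one record and nothing else.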

An attack on a private key, e.g., the strongest currently standardized
key, an ECC P-521 key, would be extraordinarily difficult, at least
given our current state of knowledge - it ought to be good enough for
the next 280 years, according to current estimates, UNLESS large-scale
quantum computing becomes feasible.  If that were to happen, the strong but
brittle approach would suddenly become more like taffy than glass, and
the fact that a single key was used would make that the prize of all
prizes.

One possible way around that difficulty would be to use the basic K out
of N approach to generate millions and perhaps billions of public keys
simultaneously, so that everyone could have their own copy.  But now you
would have to deal them out reliably, and with protection against
spoofing, and that is a significant problem in its own right.  This
requires more thought.
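
On the spoofing side, the obvious partial answer is for the consortium
to sign whatever batch of public keys it deals out, so recipients can at
least verify provenance.  A sketch, assuming Ed25519 via
pyca/cryptography and hand-waving how the consortium's own verification
key comes to be trusted in the first place:

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    consortium_sk = Ed25519PrivateKey.generate()
    consortium_pk = consortium_sk.public_key()

    # The batch of published public keys (placeholder bytes here).
    batch = b"serialized public keys for records expiring in 2107"
    signature = consortium_sk.sign(batch)

    try:
        consortium_pk.verify(signature, batch)   # recipients check before trusting the batch
    except InvalidSignature:
        raise SystemExit("spoofed or corrupted batch")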

--

Well worth looking at and thinking about as an approach nevertheless.

Best,

Allen

