At 13:55 +0100 2007/05/23, Dave Korn wrote:
On 21 May 2007 19:44, Perry E. Metzger wrote:


 http://www.physorg.com/news98962171.html

 My take: clearly, 1024 bits is no longer sufficient for RSA use for
 high value applications, though this has been on the horizon for some
 time. Presumably, it would be a good idea to use longer keys for all
 applications, including "low value" ones, provided that the slowdown
 isn't prohibitive. As always, I think the right rule is "encrypt until
 it hurts, then back off until it stops hurting"...

  It's interesting, but given that they don't (according to the article)
appear to have used any innovative techniques, just yer bog-standard special
NFS, shouldn't we really just file this under the "Moore's law continues to
apply as expected" folder?  It's not worrying to the same degree as TWINKLE
and TWIRL.

Last night at the Eurocrypt rump session, Arjen said that two things were different this time. One is that they got all the big factoring groups to work together for the first time. The other is that they managed to distribute the matrix reduction phase across four machines; previously this step has always required a single huge machine. It sounds like a big deal to me too. (I don't know how they did it. I look forward to trying to understand the details.)

I'll get the quote wrong, but he also said something like:
"doing 1024 bits sounds about 5 times easier now than doing 768 bits did in 1999 when we did 512 bits."
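For rough intuition on how the work grows with key size, here is a small sketch (my own, not from the talk) using the standard heuristic NFS running-time estimate L_n[1/3, (64/9)^(1/3)]. It ignores the o(1) term, all constant factors, and hardware progress — which is exactly what the "5 times easier" remark is about — so only the ratios between key sizes are meaningful:

```python
import math

def nfs_work(bits, c=(64.0 / 9.0) ** (1.0 / 3.0)):
    """Heuristic L_n[1/3, c] cost estimate for factoring an n-bit number.

    c = (64/9)^(1/3) is the general-NFS constant; the o(1) term and all
    constant factors are dropped, so only ratios are meaningful.
    """
    ln_n = bits * math.log(2)  # ln(2^bits)
    return math.exp(c * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

# Relative asymptotic work between the key sizes mentioned above.
ratio_1024_768 = nfs_work(1024) / nfs_work(768)
ratio_768_512 = nfs_work(768) / nfs_work(512)
print(f"1024 vs 768 bits: roughly {ratio_1024_768:.0f}x more work")
print(f" 768 vs 512 bits: roughly {ratio_768_512:.0f}x more work")
```

Each step up in key size costs roughly three orders of magnitude of asymptotic work, which is why a claim that 1024 bits now looks easier than 768 did in 1999 is about improved collaboration and hardware, not the algorithm's scaling.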

Greg.

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]
