Re: NSA backdoors and Set Preferred Cipher

2013-09-08 Thread Werner Koch
On Sun,  8 Sep 2013 01:38, r...@sixdemonbag.org said:

 Twofish, but the recipient doesn't support it... then CAST5, but that's
 not supported... then Blowfish, again not supported... hey, 3DES.  3DES

Nitpicking: CAST5 is a SHOULD algorithm

   Implementations MUST implement TripleDES.  Implementations SHOULD
   implement AES-128 and CAST5.  Implementations that interoperate [...]

thus most applications will actually stop right here.
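
For anyone who wants to inspect or change the preference list on their
own key, a minimal sketch (KEYID is a placeholder; the exact prompt text
may vary between GnuPG versions):

  gpg --edit-key KEYID
  gpg> showpref
  gpg> setpref AES256 AES192 AES CAST5 3DES SHA256 SHA1 ZLIB ZIP Uncompressed
  gpg> save

3DES is implicitly appended to the cipher list in any case, so it always
remains available as the fallback of last resort.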


Salam-Shalom,

   Werner

-- 
Die Gedanken sind frei.  Ausnahmen regelt ein Bundesgesetz.




Re: NSA backdoors and Set Preferred Cipher

2013-09-08 Thread Filip M. Nowak
On 09/08/2013 01:45 AM, Robert J. Hansen wrote:
 On 9/7/2013 4:59 PM, Filip M. Nowak wrote:
 Is CAMELLIA's pick as least preferred cipher - omitted/disregarded by
 NIST (US) but certified by NESSIE (EU) and CRYPTREC (Japan) - somehow
 related to those revelations?
 
 NIST couldn't consider it for an AES candidate because it hadn't been
 invented yet.
 
 A brief timeline:
 
 1997: NIST begins the AES selection process
 1998: Rijndael is published
 2000: Camellia is published, too late to be considered for AES
 2000: NESSIE begins evaluating new algorithms
 2000: CRYPTREC begins evaluating new algorithms
 2001: Rijndael is selected to become the Advanced Encryption Standard
 2003: CRYPTREC and NESSIE each select Camellia
 

Good point!


Regards,
Filip M. Nowak



How can I send a public key exported with --armour option?

2013-09-08 Thread Francesco C.
Hello everybody, I'm a beginner with GPG and the world of encryption in
general, so I apologize if my questions are banal.

I created a new pair of public-private keys and now I'm trying to export
the public one.
I read the How-To and it describes the very useful option --armour. I
can't understand how to send the ASCII text printed on my screen to
another user.

Wouldn't the --output option be more useful?



Thanks a lot to everybody!

--
Francesco


Re: Recommended key size for life long key

2013-09-08 Thread Ole Tange
On Sun, Sep 8, 2013 at 12:06 AM, Ingo Klöcker kloec...@kde.org wrote:
 On Saturday 07 September 2013 23:35:08 Ole Tange wrote:
 On Sat, Aug 31, 2013 at 11:46 AM, Ole Tange ta...@gnu.org wrote:
:
 http://oletange.blogspot.dk/2013/09/life-long-key-size.html
:
 but I'm pretty sure it's relevant for the
 battery life of your and your communication partners' smart phones. In
 particular, if you and your communication partners use equally large
 keys and encrypt each and every email, SMS, chat message, etc.

Assuming a new smartphone runs at 1 GHz, then with GnuPG 2.0 a
decrypt+verify or sign+encrypt will be on the order of 10 seconds if
both sender and receiver use 10kbit keys. So we are talking about 10
seconds per RSA-encrypted message. Potentially less if the phone is
multicore and GnuPG's RSA implementation supports parallelized RSA
operations.

If RSA is only used to negotiate the initial session key, then I would
reckon the 10 seconds is hardly noticeable from a battery perspective.
My old Nokia N900 with wifi on will let you sign+encrypt 657 messages
with 10kbit keys on a full battery using GnuPG 1.4.6. With GnuPG 2.0
that would be on the order of 1000 messages per charge.
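
If you want to check the numbers on your own hardware, a rough sketch
of the measurement (MYKEY, THEIRKEY and test.txt are placeholders):

  time gpg -u MYKEY -r THEIRKEY --sign --encrypt -o test.txt.gpg test.txt
  time gpg -o /dev/null --decrypt test.txt.gpg

The second command both decrypts and verifies the signature, so the two
timings together cover the whole round trip.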

So where your concern really matters would be high-volume messaging
(100 messages per day or more) where everything is RSA-encrypted on
battery-operated slow devices. Apart from email, can you mention any
app that works like that today?

If I am to include the battery perspective and speculation about what
apps _could_ be made, I should probably also include what would happen
if smartphones got a cryptochip (which would bring RSA operations into
the millisecond range - thus rendering the battery concern moot).


/Ole



Re: Recommended key size for life long key

2013-09-08 Thread Ole Tange
On Sun, Sep 8, 2013 at 1:53 AM, Robert J. Hansen r...@sixdemonbag.org wrote:
 On 9/7/2013 5:35 PM, Ole Tange wrote:
 Feel free to let me know if you feel I have left out important concerns.
:
 You're projecting 87 years into the future.  Why should we have any
 confidence in your analysis?

The short answer: You do not have to trust the projection to use the
other findings. If you have a better projection, use that instead.

The projection is based on www.win.tue.nl/~klenstra/key.pdf but I
think you are completely missing the point: The projection is not some
sort of guarantee that 10kbit keys will not be broken before 2100
(which is stressed by the phrase 'It should be stressed that this is
only a guess'), but simply an estimate of a key size that we cannot,
given our knowledge _today_, say for sure will be broken. Even if the
guess proves to be wrong, it will still be better than 4kbit, which
seems fairly likely to be broken by 2100 (at least if no attack is
found that renders RSA completely useless). If you can come up with a
better-supported guess for the key length, that would be great.
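
For those who want to check the asymptotics themselves: a
back-of-envelope calculation using the usual GNFS complexity estimate,
exp(1.923 * (ln n)^(1/3) * (ln ln n)^(2/3)), ignoring the o(1) term,
gives roughly 2^156 work for a 4096-bit modulus and roughly 2^228 for
a 10240-bit modulus - i.e. the larger key buys on the order of 2^70
extra work against the best published classical attack.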

Based on the guess that 10kbit has the potential of not being broken
within a person's life span: what problems would you experience if you
chose to use a 10kbit key today instead of a 4kbit key (which seems to
be the common choice - but which we are fairly certain will be broken
before 2100)? THIS (i.e. the problems of using 10kbit today instead of
4kbit) is the focus of the document.

So if you are focusing on the projection of the key size, help me
rephrase the text so that you do not focus on it, because that has
never been the intended focus of the document.

 There are a large number of other errors in your write-up, but those two
 questions above are the most critical shortcomings.

Having now addressed that question, please elaborate on what other errors you find.


/Ole



Re: How can I send a public key exported with --armour option?

2013-09-08 Thread Pete Stephenson
On Sun, Sep 8, 2013 at 8:37 AM, Francesco C.
anything.everythin...@gmail.com wrote:
 Hello everybody, I'm a beginner with GPG and the world of encryption in
 general, so I apologize if my questions are banal.

Hi Francesco,

Welcome! No need to apologize! We're all pretty friendly here. :)

 I created a new pair of public-private keys and now I'm trying to export the
 public one.

Excellent.

 I read the How-To and it describes the very useful option --armour. I
 can't understand how to send the ASCII text printed on my screen to
 another user.

 Wouldn't the --output option be more useful?

You can add "--armor" (or "--armour" - I had no idea that GnuPG
supported the British spelling of the word. Interesting!) to
essentially any command that involves data being output. For
consistency I will use the spelling without the "u", but both are
equivalent.

For example, if you created a key with the KeyID of KEYID, you could
export the public key for it to the terminal using "gpg --export
--armor KEYID".

If you wish to export the public key to a text file which you can then
include in an email, post on the web, etc., you could use "gpg
--export --armor KEYID > filename.txt" where 'filename' is whatever
you wish the file to be called.
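
If you prefer to avoid shell redirection, the --output option you
mentioned should produce the same file:

  gpg --output filename.txt --armor --export KEYID

Whoever receives the file can then load it into their keyring with
"gpg --import filename.txt".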

The armor feature is indeed quite useful, but it comes at a slight
cost: armored files/messages are slightly larger than their unarmored,
binary counterparts.

Cheers!
-Pete



Re: How can I send a public key exported with --armour option?

2013-09-08 Thread Francesco S.


Pete Stephenson p...@heypete.com wrote:
On Sun, Sep 8, 2013 at 8:37 AM, Francesco C.


Hi Francesco,

Welcome! No need to apologize! We're all pretty friendly here. :)


I'm glad to know that. :-) 

You can add "--armor" (or "--armour" - I had no idea that GnuPG
supported the British spelling of the word. Interesting!) to
essentially any command that involves data being output. For
consistency I will use the spelling without the "u", but both are
equivalent.


Yep, another thing... My English is scholastic, so I will not always be
clear in my exposition; I apologize for that as well.
Anyway you are right, GnuPG recognizes only "armor" :-)

For example, if you created a key with the KeyID of KEYID, you could
export the public key for it to the terminal using "gpg --export
--armor KEYID".

If you wish to export the public key to a text file which you can then
include in an email, post on the web, etc., you could use "gpg
--export --armor KEYID > filename.txt" where 'filename' is whatever
you wish the file to be called.

Perfect, this is what I needed! 

The armor feature is indeed quite useful, but it comes at a slight
cost: armored files/messages are slightly larger than their unarmored,
binary counterparts.

We are talking about a few kilobytes; I think this cost will be
absolutely sustainable ;-)

Cheers!
-Pete

Thank you all

Francesco 

--
Sent from my mobile phone (Android) with K-9 Mail.



Re: Recommended key size for life long key

2013-09-08 Thread Robert J. Hansen
On 9/8/2013 4:32 AM, Ole Tange wrote:
 The short answer: You do not have to trust the projection to use the
 other findings. If you have a better projection, use that instead.

I do, actually.  If I see that a major part of your write-up is
seriously lacking in rigor, that causes me to suspect the rest of your
write-up is similarly lacking.

 this is only a guess'), but simply an estimate of a key size that we
 cannot, given our knowledge _today_, say for sure will be broken.

We can't be sure 2048-bit keys will be broken by 2100.  Likewise, it's
within the realm of possibility 4096-bit keys will be broken tomorrow.

Factoring/discrete-log technology has stalled out for the last 20-odd
years after some very promising periods in the late-80s and early-90s.
The dominant technology used today is the General Number Field Sieve,
which was first developed around 1993.

This shouldn't really surprise us.  Factoring is *hard*.  It's provably
an NP problem, which means that (assuming P!=NP) there will never, ever,
ever, be an efficient algorithm for it [1].  We've been looking for
efficient ways to factor ever since Eratosthenes; it is quite likely
there simply isn't one.

So on the one hand, focusing on advances in factoring technology is
suspect.  And on the other hand, you're completely ignoring the
possibility of vast advances in other areas of number theory.  A couple
of years ago Dan Boneh published a paper that rocked a lot of people's
worlds to the core: he proved that *breaking RSA is not equivalent to
factoring* [2].  The RSA Conjecture (breaking RSA is equivalent to
factoring) is *false*.  Wow.  And since the long-term security of RSA
is predicated on the RSA Conjecture, and the idea there's no other way
to attack RSA than by factoring... along comes Dan Boneh and opens the
door to the possibility of there existing many other mathematical ways
to attack RSA.  Some of them will undoubtedly be worse than factoring.
Some of them may be better.  It's possible one of them might even be in
P.  And how do we project for the future when we cannot predict if, or
when, one of these Boneh approaches will be developed?

I am not even scratching the surface of the difficulties involved in
long-term prediction.  I know exactly where my limitations lie in making
long-term predictions.  I doubt you have a better view of the future
than I do -- and given your write-up doesn't even address the factors
that make long-term prediction difficult, I have no confidence
whatsoever in your 87-year projection.

[1] An efficient *classical* algorithm for it.  Although factoring is in
NP, it's also in BQP, which means efficient quantum algorithms exist.

[2] http://crypto.stanford.edu/~dabo/abstracts/no_rsa_red.html

Further:

By 2100, I expect some kind of quantum computation to exist.  87 years
is a long time for technology: it took us fewer than seventy years to go
from the first airplane flight to Armstrong's first steps on the Moon.
It took fewer than thirty-five years to go from ENIAC to the Apple II.

Quantum computing, if-and-when it appears at a large scale (hundreds of
qubits or more in an ensemble), will transform our society in ways that
are hard to imagine.  It is literally science fiction technology.  Right
now, two of the few things we know for a fact are that quantum computers
will be able to efficiently factor large composites and compute discrete
logarithms in finite fields.  That means RSA and Elgamal are both out
the window the instant quantum computing becomes a possibility.

Your write-up barely mentions the possibility of quantum computing, and
says nothing of the consequences should it come to pass.  Even just an
"I arbitrarily project that quantum computing will become possible in
2050 and mainstream by 2060" would be better, because at least you've
drawn a line on the map and said "beyond 2060 there be dragons, and the
world will be radically unpredictable."

What do I mean by radically unpredictable?

Imagine that you're an academic in the early 1930s, and you're working
on an estimate of how much computation humanity has done.  You bury
yourself in studies of how many computers (remember, in the 1930s,
"computer" was an occupation) there are today, how much growth there has
been for the field, how many there were in the past, and you project
dramatic exponential growth for the profession of computing.  Then along
comes ENIAC which, in the first fifteen minutes of its operation, did
more computing than had been performed in the whole of human history up
to that point.  All of your estimates are immediately null and void
because the future has bushwhacked you.

*The very first quantum computer with more than two hundred qubits will,
in its very first calculation, do more computation than has been done by
all the world's computers from 1945 to the present.*

Anyone who attempts to predict the future of computing past the
introduction of quantum elements is either (a) a science fiction author
or (b) living in sin.

Re: Recommended key size for life long key

2013-09-08 Thread Ingo Klöcker
On Sunday 08 September 2013 10:29:18 Ole Tange wrote:
 On Sun, Sep 8, 2013 at 12:06 AM, Ingo Klöcker kloec...@kde.org 
wrote:
  On Saturday 07 September 2013 23:35:08 Ole Tange wrote:
  On Sat, Aug 31, 2013 at 11:46 AM, Ole Tange ta...@gnu.org wrote:
  
  http://oletange.blogspot.dk/2013/09/life-long-key-size.html
  
  but I'm pretty sure it's relevant for the
  battery life of your and your communication partners' smart phones.
  In particular, if you and your communication partners use equally
  large keys and encrypt each and every email, SMS, chat message,
  etc.
 Assuming a new smartphone runs at 1 GHz, then with GnuPG 2.0 a
 decrypt+verify or sign+encrypt will be on the order of 10 seconds if
 both sender and receiver use 10kbit keys. So we are talking about 10
 seconds per RSA-encrypted message. Potentially less if the phone is
 multicore and GnuPG's RSA implementation supports parallelized RSA
 operations.
 
 If RSA is only used to negotiate the initial session key, then I would
 reckon the 10 seconds is hardly noticeable from a battery perspective.
 My old Nokia N900 with wifi on will let you sign+encrypt 657 messages
 with 10kbit keys on a full battery using GnuPG 1.4.6. With GnuPG 2.0
 that would be on the order of 1000 messages per charge.
 
 So where your concern really matters would be high-volume messaging
 (100 messages per day or more) where everything is RSA-encrypted on
 battery-operated slow devices. Apart from email, can you mention any
 app that works like that today?

Some chat software (on PCs) uses GnuPG for encryption, but I'm not sure
whether it uses RSA only for the initial key exchange or for every chat
message. Not having a smart phone, I have no idea whether there are
similar apps for smart phones.

Having said this, in view of Snowden's disclosures, there's definitely a 
need for such apps.


 If I am to include the battery perspective and speculation about what
 apps _could_ be made, I should probably also include what would happen
 if smartphones got a cryptochip (which would bring RSA operations into
 the millisecond range - thus rendering the battery concern moot).

Using a cryptochip might render not only the battery concern moot, but
also this whole discussion about life-long keys, because even a 1mbit
RSA key is useless if the session keys created by the cryptochip are
easily guessable by the NSA.


Regards,
Ingo




Re: Recommended key size for life long key

2013-09-08 Thread Hauke Laging
On Sun, 08.09.2013 at 11:07:21, Robert J. Hansen wrote:

Once more I feel enlightened (and I am sure I am not the only one).
From time to time it seems appropriate to me that someone say thank
you, so this time I do.

-- 
Crypto für alle: http://www.openpgp-schulungen.de/fuer/bekannte/
OpenPGP: 7D82 FB9F D25A 2CE4 5241 6C37 BF4B 8EEF 1A57 1DF5




Re: How can I send a public key exported with --armour option?

2013-09-08 Thread Werewolf
On Sun, Sep 08, 2013 at 01:47:44PM +0200, Francesco S. wrote:
 [...]
 The armor feature is indeed quite useful, but it comes at a slight
 cost: armored files/messages are slightly larger than their unarmored,
 binary counterparts.
 
 We are talking about a few kilobytes; I think this cost will be
 absolutely sustainable ;-)
 [...]

With a new key, it wouldn't be that bad.  There's a 3k difference
between my public key export and my public key export with ASCII armor.
Part of that is because it has 6 subkeys (4 of which have expired).
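
Easy enough to check for your own key (KEYID is a placeholder):

  gpg --export KEYID | wc -c
  gpg --export --armor KEYID | wc -c

The armored output should come out at roughly 4/3 of the binary size,
since radix-64 encodes every 3 bytes as 4 characters, plus a few header
lines and a checksum.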

Wolf




Re: Recommended key size for life long key

2013-09-08 Thread Filip M. Nowak
Hi

On 09/08/2013 05:07 PM, Robert J. Hansen wrote:
 On 9/8/2013 4:32 AM, Ole Tange wrote:
 The short answer: You do not have to trust projection to use the 
 other findings. If you have a better projection, use that instead.
 
 (...)
 We can't be sure 2048-bit keys will be broken by 2100.  Likewise, it's
 within the realm of possibility 4096-bit keys will be broken tomorrow.

Interesting comment coming from a sworn enemy of key lengths longer
than the default/hardcoded ones :) (no provocation or trolling
intended, Robert)

Citing B. Schneier:

"(...) If we think that's the case, the fix is easy: increase the key
lengths."*

 Factoring/discrete-log technology has stalled out for the last 20-odd
 years after some very promising periods in the late-80s and early-90s.
 The dominant technology used today is the General Number Field Sieve,
 which was first developed around 1993.
 
 This shouldn't really surprise us.  Factoring is *hard*.  It's provably
 an NP problem, which means that (assuming P!=NP) there will never, ever,
 ever, be an efficient algorithm for it [1].  We've been looking for
 efficient ways to factor ever since Eratosthenes; it is quite likely
 there simply isn't one.
 (...)

After Mr. Schneier again:

"Breakthroughs in factoring have occurred regularly over the past
several decades, allowing us to break ever-larger public keys. Much of
the public-key cryptography we use today involves elliptic curves,
something that is even more ripe for mathematical breakthroughs. It is
not unreasonable to assume that the NSA has some techniques in this area
that we in the academic world do not. Certainly the fact that the NSA is
pushing elliptic-curve cryptography is some indication that it can break
them more easily."**

And one more time:

"If we think that's the case, the fix is easy: increase the key
lengths."*

*, ** -
https://www.schneier.com/blog/archives/2013/09/the_nsas_crypto_1.html

Regards,
Filip M. Nowak



Re: Recommended key size for life long key

2013-09-08 Thread Avi

As must I. Robert has one of the clearest modes of exposition from
which I have ever been fortunate to benefit.

--Avi



User:Avraham

pub 3072D/F80E29F9 1/30/2009 Avi (Wikimedia-related key) avi.w...@gmail.com
   Primary key fingerprint: 167C 063F 7981 A1F6 71EC ABAA 0D62 B019 F80E 29F9


On Sun, Sep 8, 2013 at 12:55 PM, Hauke Laging
mailinglis...@hauke-laging.de wrote:
 On Sun, 08.09.2013 at 11:07:21, Robert J. Hansen wrote:

 Once more I feel enlightened (and I am sure I am not the only one).
 From time to time it seems appropriate to me that someone say thank
 you, so this time I do.

 --
 Crypto für alle: http://www.openpgp-schulungen.de/fuer/bekannte/
 OpenPGP: 7D82 FB9F D25A 2CE4 5241 6C37 BF4B 8EEF 1A57 1DF5





Re: Recommended key size for life long key

2013-09-08 Thread Jean-David Beyer
On 09/08/2013 04:02 PM, Filip M. Nowak wrote:
[snip]
 "Breakthroughs in factoring have occurred regularly over the past
 several decades, allowing us to break ever-larger public keys. Much of
 the public-key cryptography we use today involves elliptic curves,
 something that is even more ripe for mathematical breakthroughs. It is
 not unreasonable to assume that the NSA has some techniques in this area
 that we in the academic world do not. Certainly the fact that the NSA is
 pushing elliptic-curve cryptography is some indication that it can break
 them more easily."**
 
I would think the NSA would have two teams that might work together at
times. One is interested in breaking the encryption of those they deem
to be enemies. The other is making encryption mechanisms that are as
difficult to break as they know how, for the use of our own secret
services, State Department, and so on.

So perhaps the snooping division is pushing elliptic-curve technology
because they have a technique for breaking it that they have not
published and that has not yet been leaked.

But the other division may be developing some superior technique, such
as hyperbolic curves (I made that name up; it has nothing to do with
reality), that is at least an order of magnitude more difficult to
break, for use by any government agency that has secrets to keep but
must communicate from place to place, or from time to time. Some might
need public-key encryption methods; some might manage with symmetric-key
methods.

-- 
  .~.  Jean-David Beyer  Registered Linux User 85642.
  /V\  PGP-Key:166D840A 0C610C8B Registered Machine  1935521.
 /( )\ Shrewsbury, New Jersey  http://counter.li.org
 ^^-^^ 16:55:01 up 10 days, 23:40, 3 users, load average: 4.76, 4.43, 4.30



Re: Recommended key size for life long key

2013-09-08 Thread Leo Gaspard
On Sun, Sep 08, 2013 at 03:15:24PM -0400, Avi wrote:
 As must I. Robert has one of the clearest modes of exposition from
 which I have ever been fortunate to benefit.

I have to agree on this point.

The issue is that I disagree with him on his stance: in my opinion,
having a schedule stating when key lengths will become insecure is
useless; we just have to be able to predict that longer keys will most
likely be at least as hard to crack.

And this means that, as long as the drawbacks associated with the use
of the key are borne by the key owner only (as the tables state, encrypt
and verify operations are almost unchanged in time), recommending 10kbit
RSA keys is no issue, and can only improve overall security, at worst to
the detriment of the key owner's usability.

And each key owner might choose whatever key length they feel best
suits them, according to their usability/security needs, as long as
these choices do not impede other keyholders' usability or security.

BTW, the statement "[Dan Boneh] proved that breaking RSA is not
equivalent to factoring" is wrong: he did not prove that breaking RSA
is easier than factoring numbers; only that a whole class of ways of
proving that breaking RSA is as hard as factoring numbers is
ineffective, thereby reducing the search space for potential valid ways
of proving the conjecture. Hence the title of the article: "Breaking
RSA *may* not be equivalent to factoring".
Please pardon me if I misunderstood the English used in the abstract.

Oh, and... Please correct me if I am mistaken, but I believe the best
we can do at the moment, even with a quantum computer, is Shor's
algorithm, taking time O(log^3 n). Thus, going from 2k keys to 10k keys
makes it approx. 125 times harder to break ((10240/2048)^3 = 5^3 = 125).
Sure, not as wonderful as what it is today, but if the constant is large
enough (which, AFAIK, we do not know yet), it might keep a practical
attack infeasible for a few more years.

So, back to the initial question, I do not understand why the article
should be judged so poor. No, to me the article is not about predicting
87 years ahead of us. To me, the article is about stating that using
10k RSA keys is not as bad as one might think at first. The gist of the
article is, to me, precisely this set of tables.

And the first part is, still in my opinion, only there to explain why
10k RSA keys were chosen for the experiment - explaining that, according
to our current estimates, they might potentially resist until 2100,
assuming no major breakthrough is made until then in the cryptanalysis
field. You might notice that such precautions are taken in the article
too.

So... I find the article interesting. I would not have thought everyday
use of a 10k key would have so few drawbacks. And, finally, I want to
recall that signing keys need not be the same as certifying keys, thus
allowing us to pay the signature time only for certifications and use
normal keys the rest of the time - thus getting the best of both worlds,
being able to upgrade signing keys to stronger ones without losing one's
WoT. The only remaining drawback is when others certify our master key.

Cheers,

Leo



Re: Recommended key size for life long key

2013-09-08 Thread Doug Barton

On 09/08/2013 02:00 PM, Leo Gaspard wrote:

And this means that, as long as the drawbacks associated with the use
of the key are borne by the key owner only (as the tables state, encrypt
and verify operations are almost unchanged in time), recommending 10kbit
RSA keys is no issue, and can only improve overall security, at worst to
the detriment of the key owner's usability.


The problem here is the "knowledge sieve" issue. Ole asked some
questions and did some research, then filtered what he got back through
a mixture of his preconceived notions, desires, etc. I'm not saying that
he picked only the data that agreed with his desired conclusion, but he
seems to have studiously ignored all of the facts that point to why what
he's trying to do is a bad idea.


Now the next reader is going to come along, very likely someone who is
more naive about encryption than Ole is, and filter that blog post
through his own preconceptions, impatience, etc., and come to the
conclusion, "If I make a 10k key it'll be safe for life!" Has Ole done
this reader a disservice?


I think the biggest disservice is a false sense of security. If your 
attacker can only pole vault 10 meters, and you already have a fence 
1,000 meters high, does a 100,000 meter fence make you any more secure? 
And what happens if your attacker develops a technique that is 
universally effective against all fences, no matter the height?


The flip side question is, "What harm is there in using a 10k key?"
Aside from CPU and storage for the user of the key, everyone else in the
community bears part of the cost when they have to verify a signature
on an e-mail, for example. Is that a serious enough problem to cause us
to wish no one would suggest the use of 10k keys? I don't know.


... and all of this presupposes that his "only a guess" has any
validity to it at all. I don't know the work by Arjen K. Lenstra and
Eric R. Verheul that he based his graph on, and I have no idea if Ole's
method of projecting key sizes out into the future is valid. What I DO
know (and Robert emphasized this point in his first post) is that those
authors, and other serious heavyweights in the crypto community, have
not felt comfortable doing what Ole has done. That fact alone should be
enough reason for anyone not to take Ole's blog post seriously.


Doug




Re: Recommended key size for life long key

2013-09-08 Thread Robert J. Hansen
On 9/8/2013 5:00 PM, Leo Gaspard wrote:
 BTW, the statement "[Dan Boneh] proved that breaking RSA is not
 equivalent to factoring" is wrong: he did not prove that breaking
 RSA is easier than factoring numbers; only that a whole class of
 ways of proving that breaking RSA is as hard as factoring numbers
 is ineffective, thereby reducing the search space for potential
 valid ways of proving the conjecture. Hence the title of the article:
 "Breaking RSA *may* not be equivalent to factoring". Please pardon
 me if I misunderstood the English used in the abstract.

From the abstract: "We provide evidence that breaking low-exponent RSA
cannot be equivalent to factoring integers ... an oracle for breaking
RSA does not help in factoring integers."

The title involves the word 'may' out of an abundance of caution and a
decent respect for the opinions of the mathematical community.  Whether
he's proven it or not is not really for him to claim, but for the
mathematical community.  He's provided a conjecture and strong evidence
to support it, and I feel his conjecture is well-proven.  Your mileage
may vary.  :)

 Oh, and... Please correct me if I am mistaken, but I believe the
 best we can do at the moment, even with a quantum computer, is Shor's
 algorithm, taking time O(log^3 n). Thus, going from 2k keys to
 10k keys makes it approx. 125 times harder to break.

A factor of 125 is so small as to be irrelevant.

The real obstacle is that Shor's algorithm requires 2n qubits to be
formed in an ensemble, so you'd be going from requiring a 4k quantum
computer to a 20k quantum computer.  Given decoherence, that might
amount to a much more insurmountable obstacle.  Still, my money is on QC.

 And the first part is, still in my opinion, only here to explain why 
 10k RSA keys were chosen for the experiment. Explaining that, 
 according to our current estimates, they might potentially resist 
 until 2100, assuming no major breakthrough is made until then in the 
 cryptanalysis field.

The problem is that the exact same thing can be said for RSA-2048.
RSA-2048 is expected to last at least the next 25 years; it may last for
the next century.  Hard to say.  It depends a lot on the progression of
science-fiction-level technologies, like quantum computation.  And
again, anyone trying to predict past about 25 years needs to explain why
they are able to do this where NIST, RSA Data Security, and so many
others have failed.

