Re: SHA1 collision found

2017-02-23 Thread Christoph Anton Mitterer
On Thu, 2017-02-23 at 13:58 -0500, Robert J. Hansen wrote:
> > "Migrating to SHA256"
> section in
> the FAQ?

What I always kind of wonder is why crypto and security experts, at
least in some sense, never seem to learn.
When MD5 got its first scratches, some people started demanding its
retirement ASAP (which didn't happen... partially with the argument
that it wasn't yet broken for this or that purpose in practice)... in
the end people waited so long that it was in a way already too late.
Remember the forged MD5-based X.509 cert? And that was made by some
"good guys"; God knows how many actual attacks may have been carried
out by much stronger organisations, where people were actually harmed
in the end.

SHA-1 may have been phased out (more or less) in the X.509 world, but
it's still pretty present in many other places.
It has been known to have issues for some years, and for the same
number of years many experts still defended it as not being broken for
this or that use case...
And now we're again in the situation that it's still used in production
(probably for years to come), and we have at least a collision.
That may not be the one big fire alarm where everything burns down...
but it should really be a ringing bell...


Now every time new algorithms come up, or e.g. when ideas for the next
OpenPGP version are floated, a big bunch of experts seem to go for the
most conservative way possible. And I'm not talking about the good kind
of conservatism (i.e. using algorithms based on long-standing and
well-understood math)... but rather things like: let's better not use
SHA-512 or SHA-3 when we could just use SHA-256; let's better not
specify large curves when we can get by with a much smaller one.

And every time the same argument is brought up: that these would still
easily take hundreds of years to crack... but so far (as with SHA-1)
they were always broken much earlier.


The last time I followed the discussion about the next OpenPGP, it
seemed people rather wanted to hard-wire only a few algorithms for
everything, which would be just the same problem as with SHA-1...
instead, all algorithms should be easily exchangeable.
If the same happens for the next OpenPGP version, just with SHA-256,
I'll bet we'll face the same problems with SHA-256 far earlier than
anyone wishes.


Not to mention the more and more realistic threat posed by quantum
computers.


IMO we should rather go for the stronger algorithms, or even combine
algorithms where this makes sense because their underlying math is
different, so that breaking one would still not directly affect the
other.
And we should make every crypto algorithm as easily exchangeable as
possible.
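One way to combine algorithms whose underlying math differs, as suggested here, is simply to concatenate digests from two unrelated hash families. A minimal sketch (the helper name `combined_digest` is mine, not any standard's):

```python
import hashlib

def combined_digest(data: bytes) -> bytes:
    """Concatenate digests from two unrelated hash families.

    An attacker must break BOTH SHA-512 (Merkle-Damgard construction)
    and SHA3-512 (sponge construction) to forge a collision for the
    pair, since a structural break of one does not carry over.
    """
    return hashlib.sha512(data).digest() + hashlib.sha3_512(data).digest()

tag = combined_digest(b"hello world")
print(len(tag))  # 128 bytes: 64 from each algorithm
```

Proper combiners are a research topic of their own; plain concatenation is merely the simplest construction that makes the "breaking one does not affect the other" property concrete.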


Cheers,
Chris.

___
Gnupg-users mailing list
Gnupg-users@gnupg.org
http://lists.gnupg.org/mailman/listinfo/gnupg-users


Re: SHA1 collision found

2017-02-23 Thread sivmu
Am 23.02.2017 um 20:09 schrieb ved...@nym.hush.com:
> The Openpgp standards group is working on this.

Yes, but who knows how many years it will take until a new standard is accepted...

>
> The link you give for the collision used two PDFs.
> Using a PDF is sort of 'cheating', and does not extrapolate to being
> 'completely broken'.
>
> Assuming that it is possible to find a pre-image collision, i.e.:
>
> [1] m1.txt has an SHA-1 hash of H1
> [2] m2.txt will now have the same SHA-1 hash H1
>
> What will happen in order to generate m2.txt is that many trials of a
> gibberish string will be appended to the plaintext of m2.txt until
> one is found that has the same SHA-1 hash as m1.txt.
> BUT
> This will be quite visible in the plaintext of m2.txt, and won't fool
> anyone.
>
> With a PDF, the 'extra gibberish string' is 'hidden'. It is not in the
> actual PDF the receiver reads, only in the meta-data, the appended PDF
> 'Suffix'.

Not sure about you, but I am not able to see the difference between a valid pgp 
key and "gibberish" ;)


>
> While this is *do-able* and a good reason to move on to a future
> SHA256 hash, it would not be transferable (at this time, based on the
> PDF collision data), to find a fingerprint collision for any v4 key.
> vedaal
The question is how many tries it takes until a colliding key is found that is 
accepted by common pgp implementations when imported, is it not?


As said, if it is as easy as I think it is, providing an option for different 
hash algorithms to generate fingerprints would be a nice solution until a new 
standard is established.



OpenPGP third-party certifications do not imply trust [was: Re: Announcing paperbackup.py to backup keys as QR codes on paper]

2017-02-23 Thread Daniel Kahn Gillmor
[ not on-topic for this thread, hence the subject change ]

On Thu 2017-02-23 05:00:54 -0500, Gerd v. Egidy wrote:
>> The certificate (aka public key) includes all signatures, all the data
>> on the keyserver. It's data you don't really need to back up since it is
>> public, and it can be huge. My key.asc file is 137,424 bytes following
>> your instructions.
>
> Seems you are trusted by much more people than me ;)

I'm calling this out because it's a common misconception, and i don't
want it to lie here unchallenged when someone is browsing the archives.

The people who "sign your key" (who have created an OpenPGP
certification that binds your primary key to your User ID) are only
identifying you and your key.  They have said nothing about "trust" by
making those certifications.

For example, I am happy to certify the identity and key of someone who i
do not trust at all, as long as i know who they are and they've asserted
their key to me in-person, or across some reliable, non-forgeable
transport.

So the fact that Alice has a dozen certifications on her key and Bob has
two doesn't mean that Alice is trusted by more people than Bob at all.
It just means that more people have been willing to publicly assert that
they know Alice's identity and key than have been willing to publicly
assert the same information about Bob.

--dkg



Re: SHA1 collision found

2017-02-23 Thread vedaal


On 2/23/2017 at 1:27 PM, si...@web.de wrote:
Today it was announced that SHA-1 is now completely broken:
https://security.googleblog.com/2017/02/announcing-first-sha1-collision.html

A few weeks back it was mentioned that there is a new proposal for an
OpenPGP standard including a new algorithm for pgp fingerprints.
As this is currently not applicable in practice, I would like to know
what this new development means for pgp-gnupg and the use of SHA-1 for
key identification.

After researching how the fingerprint is generated, I think it would
be easy to include a new option in gnupg to print a fingerprint using
SHA-256. Would that be something that will/can be included in future
versions of gnupg?

=

The Openpgp standards group is working on this.

The link you give for the collision used two PDFs.
Using a PDF is sort of 'cheating', and does not extrapolate to being
'completely broken'.

Assuming that it is possible to find a pre-image collision, i.e.:

[1] m1.txt has an SHA-1 hash of H1
[2] m2.txt will now have the same SHA-1 hash H1

What will happen in order to generate m2.txt is that many trials of a
gibberish string will be appended to the plaintext of m2.txt until one
is found that has the same SHA-1 hash as m1.txt.
BUT
This will be quite visible in the plaintext of m2.txt, and won't fool
anyone.

With a PDF, the 'extra gibberish string' is 'hidden'. It is not in the
actual PDF the receiver reads, only in the meta-data, the appended PDF
'Suffix'.
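The trial-and-error suffix search described above can be sketched with a deliberately truncated hash so that it finishes instantly (2 bytes of SHA-1 here, roughly 2^16 tries; against the full 160-bit hash the same search would need on the order of 2^160 tries and is infeasible). The helper `tiny_hash` and the messages are purely illustrative:

```python
import hashlib
from itertools import count

def tiny_hash(data: bytes, nbytes: int = 2) -> bytes:
    """Truncated SHA-1 stand-in so the brute-force search finishes fast."""
    return hashlib.sha1(data).digest()[:nbytes]

target = tiny_hash(b"m1: pay Alice $10")

# Append trial gibberish to m2 until its (truncated) hash matches m1's.
# The appended suffix remains plainly visible in the plaintext, which
# is exactly the point made above about .txt files vs. PDFs.
for i in count():
    candidate = b"m2: pay Mallory $10 %% " + str(i).encode()
    if tiny_hash(candidate) == target:
        break

print(candidate)
```

The PDF trick hides this suffix in metadata the reader never sees; a plaintext file cannot.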

While this is *do-able* and a good reason to move on to a future
SHA256 hash, it would not be transferable (at this time, based on the
PDF collision data), to find a fingerprint collision for any v4 key.
vedaal


RE: SHA1 collision found

2017-02-23 Thread Robert J. Hansen
(I originally sent this off-list by mistake.  Peter was kind enough to respond 
off-list and to suggest we take it back on-list.  This email is a distillation 
of three different emails: my original, Peter's response, and a response to 
Peter.)

=

> I already answered that here[1]. The use of SHA-1 in fingerprints is 
> not susceptible to a collision attack, so it's still safe. SHA-1 in 
> fingerprints is only susceptible to a second-preimage attack which is 
> much harder than a collision attack and unheard of for SHA-1.

To which I said, "Create two keys with the same fingerprint.  Sign a contract 
with one, then renege on the deal.  When you get called into court, say "I 
never signed that, Your Honor!" and present the second key.  This collision 
pretty much shatters the nonrepudiability of SHA-1 signatures."

To which Peter quite reasonably answered that the other person has a copy of 
the public key which was used to sign the document originally.  Why should the 
fraudster's denial be believed?

The answer is that to enforce a contract (at least here in the United States) 
you must be able to prove, based on a preponderance of the evidence, that the 
other person entered into a contract with you.  So imagine this conversation:

PLAINTIFF: "Your Honor, the defendant reneged on a $10,000 contract.  Make him 
pay up."
DEFENDANT: "I never signed anything, Your Honor."
PLAINTIFF: "I have his key, it's right here."
DEFENDANT: "That's not my key.  This is my key."
PLAINTIFF: "Of course that's what he claims!  They have the same SHA-1 
fingerprint!  He did that in order to deny his signature!"
JUDGE: "So these keys are uniquely identified by the fingerprint?"
(both parties agree)
JUDGE: "And you have two keys that are identified by the same fingerprint?"
(both parties agree)
JUDGE: "And there's no way to tell which key is real?"
(both parties agree)
JUDGE: "Then we're stuck.  There's no reason to prefer one key over another.  
Plaintiff, you have failed your burden of proof in establishing the defendant 
signed the contract."

Now, you could establish proof some other way: let's say you made a videotape 
of the defendant signing the document.  If you could introduce other supporting 
evidence (which might include other signatures on keys) you might be able to 
convince the judge the signature is enforceable.  But there's nothing intrinsic 
to the signature itself which could convince the judge.

So Peter is completely right to say "but there's no reason to believe one 
person over the other."  Completely, absolutely right.  But the person asking 
the court to enforce a contract must present a reason to believe them over the 
defendant.

I hope this clarifies my answer!

(Peter also rightly remarked that he thought nonrepudiability in OpenPGP was 
kind of iffy anyway.  He and I are in complete agreement on this.  OpenPGP has 
always had very iffy nonrepudiability.  With this SHA-1 attack, I feel the 
threshold has been crossed and we need to consider it repudiable.)





Re: Announcing paperbackup.py to backup keys as QR codes on paper

2017-02-23 Thread Daniel Kahn Gillmor
On Thu 2017-02-23 03:54:12 -0500, Thomas Jarosch wrote:
> In the interest of humanity and the cause of science, I've just tried again 
> with a throwaway key :) This time it worked just fine. The "only" thing 
> that's 
> changed is that I've upgraded from Fedora 22 to Fedora 25 since I last tried.

humanity and science thank you for your efforts :)

happy hacking,

 --dkg



RE: SHA1 collision found

2017-02-23 Thread Robert J. Hansen
> Today it was announced that SHA-1 is now completely broken
> https://security.googleblog.com/2017/02/announcing-first-sha1-
> collision.html

SHA-1 is broken *for some purposes*.  That's scary enough, trust me.  Let's
not overstate things.

For the last ten years I've been saying, "The smoke alarm has gone off and
we think there's a fire.  There's no danger to anyone right now, but we need
to move to the exits in an orderly fashion.  Start migrating away from SHA-1
right now, so that when the collisions happen you've already been using
SHA256 for years."

Today we've seen the fire.  It's not surprising.  We knew this was coming,
we just didn't know when.  If you're still using SHA-1, you probably need to
begin migrating *right now* before the fire gets worse.  If you don't know
how, ask on this list and we'll help you.  But don't panic: we can help.

A question for the list: should we put a "Migrating to SHA256" section in
the FAQ?





Re: SHA1 collision found

2017-02-23 Thread Peter Lebbing
On 23/02/17 19:24, si...@web.de wrote:
> As this is currently not applicable in practice, I would like to know
> what this new development means for pgp-gnupg and the use of SHA1 for
> key identification.

I already answered that here[1]. The use of SHA-1 in fingerprints is not
susceptible to a collision attack, so it's still safe. SHA-1 in
fingerprints is only susceptible to a second-preimage attack which is
much harder than a collision attack and unheard of for SHA-1.

> After researching how the fingerprint is generated, I think it would
> be easy to include a new option in gnupg to print a fingerprint using
> sha256. Would that be something that will/can be included in future
> versions of gnupg?

Just changing how the fingerprint is displayed to the user wouldn't
help, because of all the places SHA-1 is used internally. Disclaimer:
I'm not a developer, but this is my understanding of it. I can't say
for sure.

HTH,

Peter.

[1] 

-- 
I use the GNU Privacy Guard (GnuPG) in combination with Enigmail.
You can send me encrypted mail if you want some privacy.
My key is available at 





SHA1 collision found

2017-02-23 Thread sivmu
Today it was announced that SHA-1 is now completely broken:
https://security.googleblog.com/2017/02/announcing-first-sha1-collision.html

A few weeks back it was mentioned that there is a new proposal for an OpenPGP 
standard including a new algorithm for pgp fingerprints.
As this is currently not applicable in practice, I would like to know what this 
new development means for pgp-gnupg and the use of SHA-1 for key identification.

After researching how the fingerprint is generated, I think it would be easy to 
include a new option in gnupg to print a fingerprint using SHA-256. Would that 
be something that will/can be included in future versions of gnupg?

That way users could publish both the SHA-1 and SHA-256 fingerprints in the future.
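For reference, an OpenPGP v4 fingerprint is the SHA-1 of 0x99, a two-octet length, and the public-key packet body (RFC 4880, section 12.2). Hashing exactly the same material with SHA-256, as suggested here, could be sketched like this (the function name is mine, and such a digest would not match any standardized fingerprint format):

```python
import hashlib

def v4_style_fingerprint(pubkey_packet_body: bytes, algo: str = "sha1") -> str:
    """Hash key material as RFC 4880 does for v4 fingerprints:
    0x99, a two-octet big-endian length, then the packet body."""
    prefix = b"\x99" + len(pubkey_packet_body).to_bytes(2, "big")
    return hashlib.new(algo, prefix + pubkey_packet_body).hexdigest().upper()

# A real packet body would come from the binary key export; this
# placeholder only shows the digest lengths involved.
body = b"\x04" + bytes(16)
print(v4_style_fingerprint(body))            # 40 hex digits (SHA-1)
print(v4_style_fingerprint(body, "sha256"))  # 64 hex digits
```

As Peter notes elsewhere in this thread, changing only the displayed digest would not touch the many internal uses of SHA-1, so this is a display-level idea, not a fix.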



Re: Announcing paperbackup.py to backup keys as QR codes on paper

2017-02-23 Thread NdK
On 23/02/2017 11:00, Gerd v. Egidy wrote:

> If we are talking centuries, I'd worry about the availability of gnupg as 
> much 
> as qrcodes. Both are publicly available standards, but I don't know if they 
> are still available and understandable by then. I'd recommend going to 
> plaintext on glass etched microfiche if you really want to cover that 
> timespan.
Well, when considering such a timespan there could be other (bigger)
issues... How long will a key that is 'secure' today remain secure?
When will quantum computers be widely available?
The only "guaranteed" crypto is information-theoretic (neural
networks' mutual learning, distant noisy sources, etc.), where the
adversary's probability of success is a function of the system
parameters. But it's quite impractical and AFAIK covers only
interactive key agreement.

PS: in 100 years I surely won't (be here to) care whether someone can
read my mails or not :)

BYtE,
 Diego



Re: Problems with cert validation via CRL

2017-02-23 Thread David Gray
Thanks very much for getting back to me - I really appreciate your help.  I 
have been able to get the validation to work by adding the trusted root 
certificate to the "trusted-certs" folder under the gnupg directory on my 
Windows box.  The directory wasn't there, but I was able to add it, and as long 
as the cert is there, dirmngr knows that it can trust the CRL that has been 
issued.  I haven't had a chance to circle back on my Linux installation, but 
I'm sure the same approach will work.  I'm also not sure how/why the Linux 
installation was originally able to validate the cert, but I will dig into 
that.

Thanks again for your help - it's very much appreciated!

Sent from my Mobile Device

> On Feb 21, 2017, at 9:31 PM, NIIBE Yutaka  wrote:
> 
> Hello, again,
> 
> David Gray  wrote:
>> dave@dave-VirtualBox:~/.gnupg/crls.d$ dirmngr --debug-all --fetch-crl 
>> http://crl.comodoca.com/COMODOSHA256ClientAuthenticationandSecureEmailCA.crl
> 
> Reading the code of dirmngr, I think that --fetch-crl (or dirmngr-client
> --load-crl) doesn't work well for a CRL which is not signed by system CA
> directly.  When dirmngr doesn't know the issuer, it inquires back to the
> client, and it fails as:
> 
>> dirmngr[3184.0]: DBG: find_cert_bysubject: certificate not returned by 
>> caller - doing lookup
>> dirmngr[3184.0]: error fetching certificate by subject: Configuration error
>> dirmngr[3184.0]: CRL issuer certificate 
>> {92616B82E1A2A0AA4FEC67F1C2A3F7B48000C1EC} not found
>> dirmngr[3184.0]: crl_parse_insert failed: Missing certificate
> 
> When it is gpgsm which asks dirmngr to validate a certificate, I think
> it works.
> 
> I think that you once successfully did that on this box:
> 
>> dave@dave-VirtualBox:~/.gnupg/crls.d$ gpgsm --debug-all --list-keys 
>> --with-validation
> 
> And the CRL is cached.  Thus,
> 
>> gpgsm: DBG: chan_6 -> ISVALID 
>> 685A02B9E2BD4B5EE1FA51739B8882AEA38FB3C8.3FAADAD7DD3F946B114321153B76F88C
> 
> This is gpgsm asking if your X.509 client certificate is valid or not.
> 
>> gpgsm: DBG: chan_6 <- INQUIRE ISTRUSTED 
>> 02FAF3E291435468607857694DF5E45B68851868
> 
> Here, I think that the CRL for your X.509 client certificate is cached
> and checked.  dirmngr does not ask about anything about your X.509
> client certificate or its issuer.
> 
> dirmngr inquires back to gpgsm if the root issuer is trusted.
> 
>CN=AddTrust External CA Root,OU=AddTrust External TTP Network,O=AddTrust 
> AB,C=SE
>fingerprint=02FAF3E291435468607857694DF5E45B68851868
> 
> then, gpgsm asks to gpg-agent.
> 
>> gpgsm: DBG: chan_7 -> ISTRUSTED 02FAF3E291435468607857694DF5E45B68851868
>> gpgsm: DBG: chan_7 <- S TRUSTLISTFLAG relax
>> gpgsm: DBG: chan_7 <- OK
> 
> It is trusted.  Then, gpgsm replies back to dirmngr.
> 
>> gpgsm: DBG: chan_6 -> D 1
>> gpgsm: DBG: chan_6 -> END
> 
> It's trusted.
> 
>> gpgsm: DBG: chan_6 <- OK
> 
> Then, dirmngr answers OK for the validation of your X.509 client certificate.
> 
>> gpgsm: DBG: chan_6 -> ISVALID 
>> 14673DA5792E145E9FA1425F9EF3BFC1C4B4957C.00E023CB1512835389AD616E7A54676B21
> 
> This is gpgsm asking if the intermediate certificate of following is
> valid or not:
> 
>CN=COMODO SHA-256 Client Authentication and Secure Email CA,O=COMODO CA 
> Limited,
>L=Salford, ST=Greater Manchester, C=GB
>fingerprint=59B825FC08860B04B392CC25FEC48C760753B689
> 
>> gpgsm: DBG: chan_6 <- INQUIRE ISTRUSTED 
>> 02FAF3E291435468607857694DF5E45B68851868
>> gpgsm: DBG: chan_7 -> ISTRUSTED 02FAF3E291435468607857694DF5E45B68851868
>> gpgsm: DBG: chan_7 <- S TRUSTLISTFLAG relax
>> gpgsm: DBG: chan_7 <- OK
>> gpgsm: DBG: chan_6 -> D 1
>> gpgsm: DBG: chan_6 -> END
>> gpgsm: DBG: chan_6 <- OK
> 
> Similar interactions between gpg-agent<->gpgsm<->dirmngr.
> 
>> gpgsm: DBG: chan_7 -> ISTRUSTED 02FAF3E291435468607857694DF5E45B68851868
>> gpgsm: DBG: chan_7 <- S TRUSTLISTFLAG relax
>> gpgsm: DBG: chan_7 <- OK
> 
> I don't know the exact reason, but gpgsm again asks gpg-agent.
> 
> And gpgsm shows your X.509 client certificate:
> 
>>   ID: 0x2F5900E9
>>  S/N: 3FAADAD7DD3F946B114321153B76F88C
>>   Issuer: /CN=COMODO SHA-256 Client Authentication and Secure Email 
>> CA/O=COMODO CA Limited/L=Salford/ST=Greater Manchester/C=GB
>>  Subject: /EMail=u...@domain.com
>>  aka: u...@domain.com
>> validity: 2017-01-02 00:00:00 through 2018-01-02 23:59:59
>> key type: 2048 bit RSA
>>key usage: digitalSignature keyEncipherment
>> ext key usage: emailProtection (suggested), 1.3.6.1.4.1.6449.1.3.5.2 
>> (suggested)
>> policies: 1.3.6.1.4.1.6449.1.2.1.1.1:N:
>>  fingerprint: 4A:53:A9:E6:51:32:23:DF:B4:7D:B8:A3:19:F1:3E:A3:2F:59:00:E9
>>  [Note: non-critical certificate policy not allowed]
>>  [Note: non-critical certificate policy not allowed]
>>  [validation model used: shell]
>>  [certificate is good]
> 
> On the other hand, on your Windows...
> 
>> C:\Users\dave\Downloads>gpgsm --list-keys --with-validation 

Re: Announcing paperbackup.py to backup keys as QR codes on paper

2017-02-23 Thread Gerd v. Egidy
> You might consider using a font designed for OCR rather than the current
> font.

I tried to change to OCR-B or Inconsolata 
http://stackoverflow.com/questions/316068/what-is-the-ideal-font-for-ocr

but getting that to work with enscript is not easy, as you have to find and 
install the afm and pfb files into the correct directories. This goes a lot 
deeper than just installing packages provided by whatever distro you are using.

So I think this would raise the bar for a possible user of paperbackup.py 
higher than I want to.
 
> Additionally, base64 has look-alike characters, and the only checksum is
> for the whole key. So if it says "checksum failed" you've only learned
> that factoid. A checksum per line would be better, so you can say
> "checksum failed in line n".

Can you recommend a tool to create a short checksum (crc32?) for each line?

Ideally it would be a tool or combination of tools already deployed widely, 
like the sed and sort I used in paperrestore. This would keep the checksums 
usable even when the source of paperbackup.py isn't available anymore.
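A per-line CRC32 of the kind asked about here could be sketched with Python's zlib module, which is about as widely deployed as sed and sort (the helper name `checksum_lines` is mine, not part of paperbackup.py):

```python
import zlib

def checksum_lines(text: str) -> str:
    """Append an 8-hex-digit CRC32 to every line, so a failed check
    can be pinned to a single line instead of the whole key."""
    out = []
    for line in text.splitlines():
        crc = zlib.crc32(line.encode()) & 0xFFFFFFFF
        out.append("%s  %08x" % (line, crc))
    return "\n".join(out)

# Example on two base64-looking lines, as a paper backup would contain:
print(checksum_lines("SGVsbG8sIHdv\ncmxkIQ=="))
```

A reader could recompute any line's checksum later with a one-line `python -c "import zlib; ..."`, so no special tooling needs to survive.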

Kind regards,

Gerd




Re: Announcing paperbackup.py to backup keys as QR codes on paper

2017-02-23 Thread Gerd v. Egidy
> I'm a little
> surprised that your code is as large as it is, too: using an alternate
> pipeline you might be able to significantly reduce code size.
> 
> (a) use Python 3's gpg module to export the secret key
> (b) paperkey --output-type raw --secret-key key.gpg --output key.raw

I want paperbackup.py to be independent and agnostic of gnupg. It should also 
be usable for e.g. ssh keys or ciphertext.

> (c) use Python 3's QR library to create a series of PNGs
> (d) use Wand or PythonMagick to convert the PNGs to PDF
> (e) save the PDF and you're done

I had some problems creating proper multipage pdfs, so I used PyX.

If you can come up with shorter sourcecode or less dependencies, I'm happy to 
take patches.

Kind regards,

Gerd




Re: Announcing paperbackup.py to backup keys as QR codes on paper

2017-02-23 Thread Gerd v. Egidy
Hi Peter,

> The certificate (aka public key) includes all signatures, all the data
> on the keyserver. It's data you don't really need to back up since it is
> public, and it can be huge. My key.asc file is 137,424 bytes following
> your instructions.

Seems you are trusted by much more people than me ;)

> $ gpg2 --armour --output key.asc --export-options export-minimal
> --export-secret-key [KEYID]

Thank you for your explanation and recommendation. I have adapted the readme 
on github.

> However, I'm running into a little problem here myself... GnuPG 2.1.18
> does not respect the "export-minimal" option for --export-secret-key,
> only for --export. So if you are using GnuPG 2.1, this will not work as
> intended.
> 
> This is in all likelihood a bug in GnuPG 2.1, and I'll report it right now.

Thank you for checking and reporting this.

As it will not leave out important information, just add more data that is not 
strictly needed, it won't hurt the affected users much. Just a few more dead 
trees...

> Oh, as an aside, the advantage of paperkey is that it is
> self-describing. No matter what happens, as long as we can still use
> hexadecimal digits to describe binary content (which would be trivial to
> reimplement), we can reconstruct the binary private key file. Using QR
> codes has the disadvantage that if you cannot find a QR-code decoder for
> your platform in the future, reimplementing one is far from trivial. You
> are dependent on QR codes surviving as an actually supported data format.

What timespan are we talking about?

If we are talking decades, I have no doubts that some qrcode decoder will 
still be available, even if qrcodes aren't used anymore. There are several 
open source decoders available and included in linux distributions. Stuff like 
that tends to be available for a long time: you can still download packaged 
linux distros like Red Hat Linux 1.0 (released 1995) or Debian 0.91 (released 
1994) today, about 23 years afterwards.

If we are talking centuries, I'd worry about the availability of gnupg as much 
as qrcodes. Both are publicly available standards, but I don't know if they 
are still available and understandable by then. I'd recommend going to 
plaintext on glass etched microfiche if you really want to cover that 
timespan.

> Finally, I remember something about QR codes inherently supporting
> splitting data over multiple individual code blocks, specifically for
> data that is too large to fit in a single block. I don't know if it
> supports the number of blocks you need, but you might want to check it
> out.

I know of that feature and have deliberately decided against it:

Not all decoders are capable of it, and if one qrcode is missing, the linking 
is broken and you have to patch the decoder to still get some data.

I consider the plaintext linking and ordering I used more robust, see
https://github.com/intra2net/paperbackup#encoding-and-data-format

> Also, you say large QR codes are easily damaged by wrinkles and
> deformations. Is this perhaps related to the amount of error correction
> data included? You can tune the ratio of content data to error
> correction data, making a QR code more resilient to damage.

I used the largest error correction ratio possible.

> However, if
> you find that it is not unreadable individual pixels but rather the
> deformation of the total that is throwing off decoders, than I suppose
> the ratio doesn't help: it either can reduce it to the required square
> or it cannot, I'd think.

I haven't studied the decoding algorithms at that level of detail. If the 
deformation is irregular, I guess it affects some parts of a code more than 
others. Then a higher error correction ratio will help.

Kind regards,

Gerd

