On Monday, 10 October 2016 at 03:15:07 UTC, sarn wrote:
> End users won't want to permute+encrypt multiple times unless you can show there's a benefit in the speed/security tradeoff, so you'll need to think about that in the design.

The biggest benefit would be that, much like a hash, one small change alters the entire output rather than a smaller portion (as it would with the original block size). The multiple rearranging and encryption steps are there to ensure a small change affects every other block it was part of.
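Roughly what I mean, as a toy D sketch (the stride permutation, the byte chaining, and the key value are all made up for illustration, nothing like a real cipher): flip one bit of the input, run a few permute+encrypt passes over the whole buffer, and the difference lands in every output byte.

import std.stdio;

// Toy permute+encrypt pass (not a real cipher): scatter the bytes with a
// stride permutation, then chain each byte into the next so a change
// propagates through everything processed after it.
ubyte[] permuteEncrypt(const(ubyte)[] data, ubyte key)
{
    auto output = new ubyte[data.length];
    ubyte prev = key;
    foreach (i, b; data)
    {
        // 7 is coprime to the 8-byte buffer below, so this scatter is a
        // genuine permutation for that size.
        size_t j = (i * 7 + 3) % data.length;
        output[j] = cast(ubyte)(b ^ prev);
        prev = cast(ubyte)(output[j] + key);
    }
    return output;
}

void main()
{
    ubyte[] a = [1, 2, 3, 4, 5, 6, 7, 8];
    auto b = a.dup;
    b[0] ^= 1; // one-bit difference

    // After a few passes the flipped bit has touched every output byte,
    // hash-like, instead of staying inside one block.
    foreach (_; 0 .. 4)
    {
        a = permuteEncrypt(a, 0x5A);
        b = permuteEncrypt(b, 0x5A);
    }
    writeln(a);
    writeln(b);
}

A single pass only spreads the change forward from where it happened; the repeated passes are what carry it everywhere.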

Just thinking that if someone makes a database of, say, the first 4 bytes expected in a file format (gzip, bzip2, etc.), then they can map most of the keys and immediately know how to decrypt it (assuming it's of a particular file/stream type). The larger block size also allows for multiple keys, so you could push far past the limitations of a single block cipher.
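To show why the known header matters, here's a toy D sketch. The magic bytes are the real well-known signatures, but the "cipher" is a repeating-XOR stand-in and the key and sample data are invented; anything this weak hands over the key directly, while a real block cipher would at least still be giving the attacker known plaintext for the whole first block:

import std.stdio;

void main()
{
    // Known first bytes ("magic numbers") of a few common formats.
    ubyte[][string] magic;
    magic["gzip"]  = [0x1F, 0x8B];
    magic["bzip2"] = [0x42, 0x5A, 0x68];       // "BZh"
    magic["zip"]   = [0x50, 0x4B, 0x03, 0x04]; // "PK\x03\x04"

    // Stand-in "cipher": a repeating 4-byte XOR key.
    ubyte[4] key = [0xDE, 0xAD, 0xBE, 0xEF];
    auto plain = cast(ubyte[])"PK\x03\x04 ...archive data...".dup;
    auto encrypted = new ubyte[plain.length];
    foreach (i, b; plain)
        encrypted[i] = cast(ubyte)(b ^ key[i % key.length]);

    // "Attack": XOR each candidate header against the ciphertext prefix;
    // the correct format guess yields the leading key bytes for free.
    foreach (name, sig; magic)
    {
        writef("%-6s -> key bytes: ", name);
        foreach (i, m; sig)
            writef("%02X ", encrypted[i] ^ m);
        writeln();
    }
}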

As for a specific example, not really. Something fairly small, so personal documents and the like, or archives; unlike, say, multimedia, which (probably) doesn't contain any personal data. The only other idea is another multi-step process used when generating hashes/keys, to slow down brute-forcing passwords from a hash file, or at least make it annoyingly difficult. Alternatively, with the salting, it would help encrypted communication by hiding sentences/replies where you reply the same thing over and over again (there's a sketch after the example below):

Y>Do you have the stuff?
M>Yes
Y>Did you stash it in the place?
M>Yes
Y>Do you like Lasagna?
M>Yes

or something like that :P
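Something like this toy D sketch is what I'm picturing for the salting (the XOR-against-an-iterated-hash "encryption", the 8-byte salt, and the 10,000-round count are made-up stand-ins, not a real scheme): the iterated hashing is the slow-down-brute-force part, and the fresh salt per message makes the two identical "Yes" replies come out as unrelated ciphertexts.

import std.digest.sha : sha256Of;
import std.random : uniform;
import std.stdio;

// Iterated hashing: every password/key guess costs an attacker the same
// number of rounds, which is the "annoyingly difficult" part.
ubyte[32] stretch(const(ubyte)[] seed, uint rounds)
{
    auto h = sha256Of(seed);
    foreach (_; 0 .. rounds)
        h = sha256Of(h[]);
    return h;
}

// Toy "encryption": XOR against the stretched key. A fresh random salt
// per message means the same plaintext never repeats on the wire.
ubyte[] encryptSalted(string msg, string key)
{
    ubyte[8] salt;
    foreach (ref s; salt)
        s = uniform!ubyte;

    auto pad = stretch(cast(const(ubyte)[])key ~ salt[], 10_000);
    auto output = salt[].dup; // the salt travels with the message
    foreach (i, char c; msg)
        output ~= cast(ubyte)(c ^ pad[i % pad.length]);
    return output;
}

void main()
{
    // Same reply twice; the two ciphertexts come out unrelated.
    writeln(encryptSalted("Yes", "secret"));
    writeln(encryptSalted("Yes", "secret"));
}

The salt rides in the clear at the front of the message, so the receiver can redo the same derivation to decrypt.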

Oh well. My question was mostly an idea; having something to look over for block ciphers will be an interesting read (when I get to it).
