Ido,
Perhaps you should hire Bruce Schneier or a similar expert, or read his books, which cover this topic in depth and are quite understandable to any well-educated software engineer. His web site and blog also carry much useful information.

My own take is that these requirements, while perhaps well-motivated, are misguided and naïve. The whole point of periodically standardizing on new encryption algorithms is to compensate for the increase in computing power on the one hand, and the increase in theoretical knowledge about weaknesses in existing algorithms on the other. There are many aspects to creating and maintaining the secrecy and integrity of encrypted data over time; the choice of an encryption algorithm, while important, is by no means the end of the design and implementation process.

I am also of the opinion that the clause referring to the price of the key-breaking hardware should be ignored. Anyone can assemble a supercomputer today simply by crowd-sourcing machines on the public internet, at little or no cost. For scale: exhausting a 128-bit key space at even 10^18 guesses per second would take on the order of 10^13 years, so a practical break will come from a weakness in the algorithm or its implementation rather than from raw hardware, which is exactly why a fixed cost figure is the wrong yardstick. Frankly, you need an algorithm that will withstand a much more formidable attack than the customer has considered.

Nothing in these requirements speaks to performance, either. Since the throughput of the published algorithms can differ by two orders of magnitude or more, the choice of an algorithm must take performance into account. (I have appended a rough benchmarking sketch below the quoted thread.)

As for keeping up with the expected improvements in knowledge and computing power, the simple answer is to monitor the actions of the IETF, ISO, IEEE, NIST, and other standardization bodies, and to follow CERT or a similar alerting organization for known risks. Just as important is designing the code so that the algorithm can be replaced without a rewrite; a small sketch of that is also appended below. The safe answer remains: go hire an expert.

PG

From: owner-openssl-...@openssl.org [mailto:owner-openssl-...@openssl.org] On Behalf Of Ido Regev
Sent: Monday, March 11, 2013 7:09 AM
To: openssl-dev@openssl.org
Subject: RE: Question on encryption algorithms brittleness

Hi,

I haven't found a reply to the specific question the customer is asking me. Any other direction would be greatly appreciated.

Ido

From: owner-openssl-...@openssl.org [mailto:owner-openssl-...@openssl.org] On Behalf Of Jason Gerfen
Sent: Wednesday, March 06, 2013 4:29 PM
To: openssl-dev@openssl.org
Subject: Re: Question on encryption algorithms brittleness

NIST has more details: http://csrc.nist.gov/publications/PubsFIPS.html

See FIPS 200 (minimum security requirements), FIPS 198-1 (HMAC), FIPS 197 (AES) and FIPS 185 (escrowed encryption).

On Wed, Mar 6, 2013 at 7:15 AM, Matt Caswell <fr...@baggins.org> wrote:

This site would be a good place to start: http://www.keylength.com/

Matt

On 6 March 2013 13:56, Ido Regev <ido.re...@ecitele.com> wrote:

We have a requirement from one of our customers regarding encryption algorithms:

"Make use of published public encryption algorithms that are considered to be practically unbroken. The Contracting Authority considers an algorithm practically unbroken when a key cannot be recovered within 1 year with hardware costing less than 1,000,000 euro. We should have a life-cycle process for the encryption algorithms in place to ensure the 1-year duration is kept despite ever-increasing computing power. Describe the process."

We would greatly appreciate it if you could help us with this question.

Best regards,
Ido
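P.S. On the performance point above: here is a rough, purely illustrative way to see the throughput spread for yourself. The cipher names, 8 KiB buffer, iteration count, and all-zero key are arbitrary demo choices of mine, not recommendations, and the openssl speed command is the proper tool for real measurements.

/*
 * bench_ciphers.c -- illustrative only; build with: gcc bench_ciphers.c -lcrypto
 * Times a few EVP ciphers encrypting the same buffer repeatedly.
 */
#include <stdio.h>
#include <time.h>
#include <openssl/evp.h>

static double time_cipher(const char *name, int iterations)
{
    const EVP_CIPHER *cipher = EVP_get_cipherbyname(name);
    EVP_CIPHER_CTX *ctx;
    unsigned char key[64] = {0}, iv[16] = {0};   /* zero key: timing only! */
    static unsigned char in[8192], out[8192 + 64];
    int len, i;
    clock_t start;
    double elapsed;

    if (cipher == NULL)
        return -1.0;                             /* name not recognised */
    if ((ctx = EVP_CIPHER_CTX_new()) == NULL)
        return -1.0;

    EVP_EncryptInit_ex(ctx, cipher, NULL, key, iv);
    start = clock();
    for (i = 0; i < iterations; i++)
        EVP_EncryptUpdate(ctx, out, &len, in, (int)sizeof(in));
    elapsed = (double)(clock() - start) / CLOCKS_PER_SEC;

    EVP_CIPHER_CTX_free(ctx);
    return elapsed;
}

int main(void)
{
    const char *names[] = { "aes-128-cbc", "aes-256-cbc", "des-ede3-cbc" };
    int i;

    OpenSSL_add_all_ciphers();   /* load the cipher name table (needed on 1.0.x) */
    for (i = 0; i < 3; i++) {
        double t = time_cipher(names[i], 10000);
        if (t >= 0.0)
            printf("%-14s %7.3f s for 10000 x 8 KiB\n", names[i], t);
        else
            printf("%-14s not available\n", names[i]);
    }
    return 0;
}

On a typical machine AES should come out well ahead of triple-DES, and the gap only widens once you include public-key operations, which is the kind of spread I mean.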
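P.P.S. And on the life-cycle question itself, the most useful technical property is algorithm agility: the cipher is named in configuration rather than hard-coded, so migrating off a deprecated algorithm becomes a configuration change plus a re-encryption pass rather than a code release. Here is a minimal sketch of what I mean, using the EVP layer; the function name encrypt_buffer and the APP_CIPHER environment variable are my own hypothetical illustration, not an established pattern.

/*
 * Illustrative sketch of algorithm agility via OpenSSL's EVP layer.
 * The caller must supply a key and IV of the lengths the named cipher
 * expects (see EVP_CIPHER_key_length() / EVP_CIPHER_iv_length()).
 */
#include <stdio.h>
#include <stdlib.h>
#include <openssl/evp.h>

int encrypt_buffer(const char *cipher_name,
                   const unsigned char *key, const unsigned char *iv,
                   const unsigned char *in, int in_len,
                   unsigned char *out, int *out_len)
{
    const EVP_CIPHER *cipher = EVP_get_cipherbyname(cipher_name);
    EVP_CIPHER_CTX *ctx;
    int len, total = 0, ok = 0;

    if (cipher == NULL) {
        fprintf(stderr, "unknown cipher: %s\n", cipher_name);
        return 0;                        /* fail closed on unknown names */
    }
    if ((ctx = EVP_CIPHER_CTX_new()) == NULL)
        return 0;

    if (EVP_EncryptInit_ex(ctx, cipher, NULL, key, iv)
        && EVP_EncryptUpdate(ctx, out, &len, in, in_len)) {
        total = len;
        if (EVP_EncryptFinal_ex(ctx, out + total, &len)) {
            *out_len = total + len;
            ok = 1;
        }
    }
    EVP_CIPHER_CTX_free(ctx);
    return ok;
}

int main(void)
{
    /* The name comes from the environment purely for this demo; a real
     * system would read it from trusted configuration. Demo key/IV
     * buffers suit ciphers with keys up to 32 bytes and IVs up to 16. */
    const char *name = getenv("APP_CIPHER");
    unsigned char key[32] = {0}, iv[16] = {0};   /* demo values only */
    unsigned char msg[] = "rotate me without a recompile";
    unsigned char out[sizeof(msg) + 32];
    int out_len;

    if (name == NULL)
        name = "aes-256-cbc";            /* demo default (32-byte key) */

    OpenSSL_add_all_ciphers();   /* load the cipher name table (needed on 1.0.x) */
    if (encrypt_buffer(name, key, iv, msg, (int)sizeof(msg), out, &out_len))
        printf("encrypted %d bytes with %s\n", out_len, name);
    return 0;
}

The design point is simply that nothing above names an algorithm at compile time: when NIST or your own monitoring deprecates the current choice, you change the configured name, re-encrypt stored data under the new algorithm, and retire the old keys. That, in outline, is the "process" the customer is asking you to describe.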