Re: combining entropy
On Sat, 25 Oct 2008, John Denker wrote:

> On 10/25/2008 04:40 AM, IanG gave us some additional information.
> Even so, it appears there is still some uncertainty as to
> interpretation, i.e. some uncertainty as to the requirements and
> objectives.
>
> I hereby propose a new scenario. It is detailed enough to be amenable
> to formal analysis. The hope is that it will satisfy the requirements
> and objectives ... or at least promote a more precise discussion
> thereof.
>
> We start with a group comprising N members (machines or persons).
> Each of them, on demand, puts out a 160 bit word, called a "member"
> word. We wish to combine these to form a single word, the "group"
> word, also 160 bits in length.

If you are interested in something with a formal analysis, you should
check out work on (single-source or multiple-source) extractors.

---------------------------------------------------------------------
The Cryptography Mailing List
Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]
Re: combining entropy
John Denker wrote:
> On 09/29/2008 05:13 AM, IanG wrote:
>> My assumptions are:
>>
>>  * I trust no single source of Random Numbers.
>>  * I trust at least one source of all the sources.
>>  * no particular difficulty with lossy combination.
>
>> If I have N pools of entropy (all same size X) and I pool them
>> together with XOR, is that as good as it gets?
>
> Yes.
>
> The second assumption suffices to prove the result,
> since (random bit) XOR (anything) is random.

Unless you have a possible case where (say) for any given pool,
alternate bits are predictable; XORing all 'n' pools would still give
a maximum entropy of 50%, as the XOR of all 'n' predictable bits is
predictable.

Using a hash which performs error diffusion, I would expect that 'n'
equal to 3 would give a suitable combined stream in that case
(assuming the 50% of random bits *are* random, of course). 2 is
possibly good enough, but I would probably over-engineer at 3, in
case one pool went non-random.
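A minimal sketch of the failure mode described above, under an assumed toy model (not from the original post): each pool emits 160-bit words whose even-indexed bits are random but whose odd-indexed bits follow a fixed, predictable pattern. XOR leaves every odd bit predictable, while hashing the concatenation diffuses the randomness across all output bits.

```python
import hashlib
import secrets

EVEN_MASK = int("55" * 20, 16)   # bits 0, 2, 4, ... of a 160-bit word
ODD_MASK = int("aa" * 20, 16)    # bits 1, 3, 5, ...

def pool_word(pattern: int) -> int:
    """160-bit word: random even bits, predictable odd bits."""
    return (secrets.randbits(160) & EVEN_MASK) | (pattern & ODD_MASK)

pools = [pool_word(0) for _ in range(3)]

# XOR combiner: every odd bit of the result is still predictable.
xor_combined = 0
for w in pools:
    xor_combined ^= w
assert xor_combined & ODD_MASK == 0   # the predictable half survives XOR

# Hash combiner: concatenate the member words and hash once, so the
# 50% of genuinely random bits get diffused over all 160 output bits.
data = b"".join(w.to_bytes(20, "big") for w in pools)
hashed = hashlib.sha1(data).digest()
```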
Cryptologic History Symposium: Call for Papers
Forwarded with permission.

--- From: "Sieg, Kent G" <[EMAIL PROTECTED]>
Subject: Symposium Call for Papers
Date: Mon, 27 Oct 2008 10:23:50 -0400

Just sending notice of our upcoming Symposium, especially if you can
present or know of a colleague who would like to do so.

Dr. Kent Sieg

The Center for Cryptologic History announces a call for papers for
its biennial Symposium on Cryptologic History. The Symposium will
occur on 15-16 October 2009 in Laurel, Maryland, at the Johns Hopkins
Applied Physics Laboratory located in the Baltimore-Washington
corridor. The theme for the Symposium will be "Global Perspectives on
Cryptologic History". We will consider all proposals relating to any
aspect of cryptologic history.

The deadline for submission of proposals, to include a minimum
two-page topic prospectus, a brief source list, and a biography, is
10 January 2009. Selected presenters will receive notification by
1 March 2009.

For further information, contact Dr. Kent Sieg, Symposium
coordinator, at 301-688-2336 or [EMAIL PROTECTED]

--Steve Bellovin, http://www.cs.columbia.edu/~smb
Rubber-hose cryptanalysis?
http://news.cnet.com/8301-13739_3-10069776-46.html?tag=mncol

--Steve Bellovin, http://www.cs.columbia.edu/~smb
Re: Who cares about side-channel attacks?
Peter Gutmann wrote:
> In fact none of the people/organisations I queried about this fitted
> into any of the proposed categories, it was all embedded devices,
> typically SCADA systems, home automation, consumer electronics, that
> sort of thing, so it was really a single category which was "Embedded
> systems". Given the string of attacks on crypto in embedded devices
> (XBox, iPhone, iOpener, Wii, some not-yet-published ones on HDCP
> devices :-), etc) this is by far the most at-risk category because
> there's a huge incentive to attack them, the result affects
> tens/hundreds of millions of devices, and the attacks are immediately
> and widely actively exploited (modchips/device unlocking/etc, an
> important difference between this and academic proof-of-concept
> attacks), so this is the one where I'd expect the vendors to care
> most.

But they've all been unlocked using easier attacks, surely?

-- 
http://www.apache-ssl.org/ben.html http://www.links.org/

"There is no limit to what a man can do or how far he can go if he
doesn't mind who gets the credit." - Robert Woodruff
Cloning resistance in bluetooth
Suppose one has a system that automatically signs you on to anything
if your cell phone is within bluetooth range of your computer, and
automatically signs you out of everything, putting up a screen saver
that will not go away, when your cell phone is out of range of your
computer. What is the basis for the cloning resistance of a cell
phone with bluetooth?

NFC provides physical authenticity - privacy on the model of
whispering in one's ear, and authentication by touching. Is there any
mechanism intended for mapping that to keys, so that when two NFC
devices meet, they can give each other petnames, and subsequently
recognize public keys by petname?
Re: data rape once more, with feeling.
"Usability research" about how to track web users? How Google-like.
Can't you just dump a 25-year cookie on them from twelve different
directions, and be done with it?

> Federated Login has been a "holy grail" in the identity community
> for a long time. We have known how to do the technical part for a
> long time. However the industry has constantly tried, and failed,
> to find a model that was (1) simple for end users, and (2) had a
> reasonable trust model between the RP (the relying party, which is
> the site you want to log into) and the IDP (the identity provider,
> who will identify you to the RP).

Explicitly ignoring the trust model between the end users and the RP,
and the trust between the end users and the IDP. Why should end users
trust your web site? Why should they trust an IDP like Google?

It's not that every website that requires a login is a privacy swamp.
But the big ones pretty much all are, and those are the ones who want
to impose this new model without bothering the end user's little head
about whether he should trust them. And if every little wiki that
just uses logins to slightly limit spam today began using "federated
identity", then ALL of them would become privacy swamps.

> For example, the site might require users to agree to a Terms of
> Service.

Let's see an example of how you're automating how the USER might
require the SITE to agree to a Terms of Service. Doesn't seem to be
part of the model, which is that the SITE has something valuable it
needs lawyers to protect, while the USER is just an
eyeball-with-attached-wallet to be sold to the highest bidder.

> When users are presented with a traditional signup page that asks
> for E-mail, password, & password confirmation, it is quite common
> for 30-50% of users to not finish the process.

I wonder why not! Perhaps they do not want to be tracked, numbered,
wiretapped, monitored, herded, logged, datamined, folded, spindled,
and mutilated. Perhaps they just want to look at a web site without
tying their reading habits to their social security number and their
medical records.

Similar percentages describe how many people lie through their teeth
to get into random websites. So half won't even log in, and half of
those will lie like hell; a quarter of the people either think you're
trustworthy, or are too stupid to care. Which fraction is "federated
identity" aimed at? Catching the liars, i.e. fencing in the people
who actually take care to protect their privacy? Yeah, those are the
guys this community wants you to screw as hard as you can. :-(

> In this scenario, buy.com could detect that the domain name is for
> an IDP that it trusts. It could then redirect the user to AOL to
> verify their identity. Assuming the user approves sharing their
> identity, then the user will be redirected back to buy.com which can
> automatically create an account for them, and log them in.

That's an interesting assumption. Why would you assume that AOL would
give users the choice? AOL is not famous for choice. Wouldn't AOL
just read the user's 25-year AOL cookie, and redirect the browser
back to buy.com with full account information supplied, without any
interaction with the user at all? AOL could probably even charge the
RP a few bucks for doing so. How simple. How evil. Franchising your
privacy violations.

> The RP can control what IDPs it trusts, and even switch their users
> back to legacy logins if the IDP becomes untrustworthy.

You can pretty well guarantee that RP websites will somehow decline
to trust any IDP that provides privacy to the end user -- like
mailinator.com, for example. A few web sites that "send you an email
to verify you are who you say you are" already blacklist mailinator,
though it's usually easy to bypass the blacklist by using one of its
alternative domain names.

	John
Re: combining entropy
John Denker <[EMAIL PROTECTED]> wrote:

> To say the same thing in more detail: Suppose we start
> with N generators, each of which puts out a 160 bit word
> containing 80 bits of _trusted_ entropy. That's a 50%
> entropy density.

So you need a 2:1 or heavier compression that won't lose entropy. If
you just need one 160-bit word out per N in, then hashing them is the
obvious way to do that.

> We next consider the case where N-1 of the generators have
> failed, or can no longer be trusted, ...
>
> XOR is provably correct because it is _reversible_ in the
> thermodynamic sense. That means it cannot increase or
> decrease the entropy.

Yes, but the proof holds for any reversible mapping. XOR makes each
output bit depend on exactly two input bits. Sometimes you want a
mapping that mixes them better.

If one input is entirely random, XOR is fine; random ^ x is random
for any x. It is also fine in the case above, where only one
generator works. If > 1 inputs have some entropy but none have
enough, which seems to me the commonest case, XOR is not the best
choice; it does not mix well enough.

Nyberg's perfect s-boxes are in some ways the ideal mixer: 2n bits
in, n out, all columns and all linear combinations of columns are
bent functions. Big S-boxes are expensive, though, and building even
small Nyberg S-boxes is going to take significant effort. Designing
something that uses a bunch of, say, 8 by 4 S-boxes to do good mixing
on 160-bit chunks is not trivial either.

You could use IDEA multiplication in mixing. Two 16-bit words in, one
out, and every output bit depends on all input bits. If every 16-bit
input word has 50% entropy density (not the same as every 160-bit
word does, but perhaps close enough) then the output should have
100%.

For N > 1, you need to combine those and worry about overall mixing.
If entropy density is known to be ~50%, you can combine pairs with
IDEA to get ~100%, then use cheaper operations for any other mixing
needed.
I'd use addition, which costs about the same as XOR but gives
slightly better mixing because of carries.

For N > 2 and density < 50%, you could use a cascade of IDEA
operations, 8->4->2->1 or whatever. Or do something like: combine two
160-bit chunks with 10 IDEA multiplications, circular shift the
result 8 bits, combine with the next 160-bit input, ...

At some point, you may find yourself designing a hash. If that
happens, just give up and use a standard hash.

-- Sandy Harris, Quanzhou, Fujian, China
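A minimal sketch of the IDEA-style mixing described above (my construction of the idea, not code from the post): IDEA multiplication is multiplication modulo 2^16 + 1, with the all-zero word standing in for 2^16, so that every output bit depends on both input words. `combine_160` applies it word-by-word to two 160-bit chunks, i.e. the "10 IDEA multiplications" step of the suggested cascade.

```python
MOD = 0x10001  # 2^16 + 1, a prime, so all 2^16 residues are invertible

def idea_mul(a: int, b: int) -> int:
    """IDEA multiplication of two 16-bit words; 0 represents 2^16."""
    a = a or 0x10000
    b = b or 0x10000
    # Product mod 2^16+1 is in 1..2^16; masking maps 2^16 back to 0.
    return (a * b) % MOD & 0xFFFF

def combine_160(x: bytes, y: bytes) -> bytes:
    """Combine two 160-bit chunks with 10 IDEA multiplications,
    one per 16-bit word pair."""
    out = bytearray()
    for i in range(0, 20, 2):
        a = int.from_bytes(x[i:i + 2], "big")
        b = int.from_bytes(y[i:i + 2], "big")
        out += idea_mul(a, b).to_bytes(2, "big")
    return bytes(out)
```

Unlike XOR or addition, each 16-bit output word here depends on all 32 input bits of its word pair; for full 160-bit mixing you would still need the circular shifts or cascade stages sketched above.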
Re: combining entropy
Alas on 10/25/2008 01:40 PM, I wrote:

> To summarize: In the special sub-case where M=1, XOR
> is as good as it gets. In all other cases I can think
> of, the hash approach is much better.

I should have said that in the special sub-case where the member word
has entropy density XX=100% _or_ in the special sub-case where M=1,
XOR is as good as it gets. In all other cases I can think of, the
hash approach is much better.

(I excluded the XX=100% case earlier in the note, but I should have
included it in the summary. Sorry.)
Re: combining entropy
On 10/25/2008 04:40 AM, IanG gave us some additional information.
Even so, it appears there is still some uncertainty as to
interpretation, i.e. some uncertainty as to the requirements and
objectives.

I hereby propose a new scenario. It is detailed enough to be amenable
to formal analysis. The hope is that it will satisfy the requirements
and objectives ... or at least promote a more precise discussion
thereof.

We start with a group comprising N members (machines or persons).
Each of them, on demand, puts out a 160 bit word, called a "member"
word. We wish to combine these to form a single word, the "group"
word, also 160 bits in length. We must find a combining function.

The primary objective is to maximize the entropy of the group word.
A secondary objective is computational simplicity.

The members can be categorized as follows:

 a) Some of them are abjectly broken. Their outputs have zero entropy
    density. Constant outputs and/or predictable outputs fall into
    this category.

 b) Some of them are malicious. Their outputs may appear random, but
    are in fact predictable by our adversary.

 c) M of them have an entropy density greater than XX. As a concrete
    example, we consider the case where XX=50%, i.e. 80 bits of
    entropy in a 160 bit word.

 d) Some of them could contain a high entropy density, very close to
    100%. For our example, we assume there are none of these;
    otherwise the problem would be too easy.

If we do things properly, case (b) is no worse than case (a), for
reasons that will become apparent shortly, so we can lump these cases
together.

We don't know which generator falls into which category. All we need
to know is that M of the generators are putting out useful entropy.

I recommend the following combining function: concatenate all N of
the member words, and then feed that through a hash function to
produce the group word. Since SHA-1 is efficient and has a 160 bit
output word, it will serve nicely in our example.

In the sub-case where M=1, the recommended hash-based procedure
produces a group word with 80 bits of entropy, i.e. a 50% entropy
density, which is the best we can do. In this sub-case, SHA-1 is no
better than XOR.

As M increases, the entropy density of the output word converges
rather rapidly to 100%. This is subject to mild assumptions about
the hash function actually working as a hash function, i.e. not
being grossly broken.

When M is greater than 1, the hash function approach is much better
than the XOR approach. Here is an easy proof: Consider the case
where each member in category (c) puts out a 160 bit word consisting
of 80 totally random bits in the left half and 80 constant bits in
the right half. XORing these together only gets you to 80 bits of
entropy in the group word, whereas hashing is better by a factor of
2. Actually (2 minus epsilon) if you want to be fussy about it.

In the case where the entropy is evenly distributed within the member
word, i.e. 160 bits each with a 50% entropy density, the result is
more subtle: The group word will converge to 100% entropy density,
but the hash version converges _faster_ than the XOR version. Here
"faster" means you can get by with a smaller M. Considerably smaller.
Also (!) beware that to get XOR to converge at all, this paragraph
depends on some properties of the members that may be hard to realize
in practice ... whereas the hash approach has no such dependence.

=======

To summarize: In the special sub-case where M=1, XOR is as good as it
gets. In all other cases I can think of, the hash approach is much
better.

My analysis applies to a specific set of requirements. If somebody
wants to discuss other requirements, please be specific about what
the requirements are. There are innumerable creeping features that
we could discuss.
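A minimal sketch of the recommended combining function and of the half-random/half-constant proof example above. The member-generation details are illustrative assumptions, not part of the original post: each category (c) member emits 80 random bits in the left half and 80 constant (zero) bits in the right half.

```python
import hashlib
import secrets

def member_word() -> bytes:
    """Category (c) member: 80 random bits then 80 constant bits."""
    return secrets.token_bytes(10) + b"\x00" * 10

def group_word(members: list) -> bytes:
    """Recommended combiner: concatenate all member words, then hash
    to a 160-bit group word with SHA-1."""
    return hashlib.sha1(b"".join(members)).digest()

members = [member_word() for _ in range(5)]

# XOR combiner: the constant right half stays constant, so the group
# word carries at most 80 bits of entropy regardless of M.
xor = bytes(20)
for m in members:
    xor = bytes(a ^ b for a, b in zip(xor, m))
assert xor[10:] == b"\x00" * 10

# Hash combiner: entropy from every member is diffused across the
# whole 160-bit output, approaching 100% density as M grows.
gw = group_word(members)
assert len(gw) == 20
```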
Re: combining entropy
On Sat, Oct 25, 2008 at 12:40 PM, IanG <[EMAIL PROTECTED]> wrote:
> Jonathan Katz wrote:
>> I think it depends on what you mean by "N pools of entropy".
>
> I can see that my description was a bit weak, yes. Here's a better
> view, incorporating the feedback:
>
> If I have N people, each with a single pool of entropy,
> and I pool each of their contributions together with XOR,
> is that as good as it gets?

I think you need to define what you mean by "as good as it gets".
Clearly XOR loses entropy that might be there, so on the measure of
"good == most entropy", it is not.
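A small illustration of the point above (my example, not from the post): XOR can lose entropy that is actually present. If two contributions happen to be identical random words, XOR cancels them to a constant, while hashing the concatenation still depends on the random value.

```python
import hashlib
import secrets

w = secrets.token_bytes(20)   # one 160-bit random word
pools = [w, w]                # two perfectly correlated contributions

# XOR combiner: identical inputs cancel to all zeros, so all 160 bits
# of entropy in w are lost.
xor = bytes(a ^ b for a, b in zip(*pools))
assert xor == bytes(20)

# Hash combiner: the output still depends on w, so the entropy is not
# cancelled (though a hash cannot create entropy either).
h = hashlib.sha1(b"".join(pools)).digest()
```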