Re: solving the wrong problem
'Chindogu' seems almost appropriate, but maybe not exact:

http://www.designboom.com/history/useless.html
http://www.pitt.edu/~ctnst3/chindogu.html

--Anton
Re: solving the wrong problem
Dave Howe wrote:
> > "Nonsense fence" may be less metaphoric but more clear.
> I disagree - "one picket fence" gives a clear impression of a protective
> device that is hardened at but one point - leaving the rest insecure.
> "Nonsense fence" doesn't give any real image.

Perhaps, but sometimes rubbish is just better named rubbish, without any
metaphorical allusions. For everyone's good.

--
Ilya Levin
http://www.literatecode.com
Re: solving the wrong problem
John Denker wrote:
> That's an interesting topic for discussion, but I don't think
> it answers Perry's original question, because there are plenty
> of situations where the semblance of protection is actually a
> cost-effective form of security. It's an example of statistical
> deterrence.

i've frequently used a metaphor about a bank vault door installed in the
middle of an open field:

http://www.garlic.com/~lynn/aadsm15.htm#9 Is cryptography where security took the wrong branch?
http://www.garlic.com/~lynn/2002l.html#12 IEEE article on intelligence and security
http://www.garlic.com/~lynn/2003h.html#26 HELP, Vulnerability in Debit PIN Encryption security, possibly
http://www.garlic.com/~lynn/2003n.html#10 Cracking SSL

the other metaphor is the one about if all you have is a hammer, then all
problems become nails. and for some of the PKI-related efforts ... frequently
they start out claiming the answer is PKI ... before asking what the problem
is.

one of the current issues is that some financial operations are using a value
for a userid-like capability and at the same time using the same value as a
password-like capability. the userid function requires fairly high security
integrity ... aka, from PAIN:

* privacy
* authentication
* integrity
* non-repudiation

and the userid capability also requires fairly general availability, in order
to establish permissions and as the basis for other business operations.
however, the password capability requires very high privacy and
confidentiality. the result is diametrically opposed use criteria ... high
integrity and generally available ... vis-a-vis high confidentiality.

pure encryption might claim to meet the high confidentiality requirements ...
but that then tends to break all the "generally available" requirements for
its userid function (and/or exposing it in the clear for all its business use
operations creates an enormous number of points for the value to leak out).

the fundamental threat then turns out not to be that there isn't enough
encryption ... the fundamental threat is a dual-use compromise ... where the
same information is being used to select permissions (aka userid) and needs
to be generally available ... while at the same time serving as a password
(for authentication).
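A minimal sketch of the separation described above: the account number stays
in the generally available "userid" role, while the "password" role moves to
a separate secret that the relying party stores only as a salted hash. The
function names, storage layout, and numbers below are illustrative
assumptions, not a description of any deployed financial system.

```python
# Illustrative sketch: keep the identifier public, keep the authenticator
# secret, and never let one value play both roles.
import hashlib
import hmac
import os

accounts = {}  # account_number -> record (hypothetical in-memory store)

def enroll(account_number: str, secret: str) -> None:
    """Store only a salted hash of the authenticator; the account number
    itself can circulate freely for business operations."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, 100_000)
    accounts[account_number] = {"salt": salt, "digest": digest,
                                "permissions": ["view", "pay"]}

def authenticate(account_number: str, secret: str) -> bool:
    """Check the separate secret; knowing the account number alone proves nothing."""
    rec = accounts.get(account_number)
    if rec is None:
        return False
    candidate = hashlib.pbkdf2_hmac("sha256", secret.encode(),
                                    rec["salt"], 100_000)
    return hmac.compare_digest(candidate, rec["digest"])

enroll("4417-1234-5678-9113", "correct horse battery staple")
print(authenticate("4417-1234-5678-9113", "correct horse battery staple"))  # True
print(authenticate("4417-1234-5678-9113", "guess"))                         # False
```

With this split, leaking the account number through normal business use no
longer compromises authentication, which is exactly the dual-use failure the
post describes.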
Re: solving the wrong problem
Peter Fairbrother <[EMAIL PROTECTED]> writes:

> Peter Gutmann wrote:
>> Peter Fairbrother <[EMAIL PROTECTED]> writes:
>>> Didn't the people who did US/USSR nuclear arms verification do something
>>> very similar, except the characterised surface was sparkles in plastic
>>> painted on the missile rather than paper?
>>
>> Yes. The intent was that forging the fingerprint on a warhead should cost as
>> much or more than the warhead itself.
>
> Talking of solving the wrong problem, that's a pretty bad metric - forging
> should cost the damage an extra warhead would do, rather than the cost of an
> extra warhead. That's got to be in the trillions, rather than a few hundred
> thousand for another warhead.

The cost was US$12M per warhead. I think that's sufficient.

Peter.
Re: solving the wrong problem
Perry E. Metzger writes:
> Anyone have a good phrase in mind that has the right sort of flavor
> for describing this sort of thing?

Well, I've always said that crypto without a threat model is like "cookies
without the milk".

--
--My blog is at blog.russnelson.com        | In a democracy the rulers
Crynwr sells support for free software     | PGPok | are older versions of the
521 Pleasant Valley Rd.  | +1 315-323-1241 | popular kids from high
Potsdam, NY 13676-3213   |                 | school. --Bryan Caplan
Re: solving the wrong problem
On Tue, Aug 09, 2005 at 01:04:10AM +1200, Peter Gutmann wrote:
> That sounds a bit like "unicorn insurance"
> [..]
> However, this is slightly different from what Perry was suggesting.
> There seem to be at least four subclasses of problem here:
>
> 1. "???": A solution based on a misunderstanding of what the real problem is.
>
> 2. "Unicorn insurance": A solution to a nonexistent problem.
>
> 3. "???": A solution to a problem created artificially in order to justify its
>    solution (or at least to justify publication of an academic paper
>    containing a solution).
>
> 4. "PKI": A solution in search of a problem.

Nice list, and terms for the remaining ??? cases would be nice, but I'm not
sure that any of these captures one essential aspect of the problem Perry
mentioned, at least as I see it.

One of the nice aspects of the "snake oil" description is the implications it
has about the dodgy seller, rather than the product. To my view, much of the
Quantum Cryptography (et al) discussion has this aspect: potentially very
cool and useful technology in other circumstances, but being sold into a
market not because it is particularly needed there, but just because that's
where the money is. Certainly, that's the aspect I find most objectionable,
and thus deserving of a derogatory term, rather than just general frustration
at naive user stupidity. None of the terms proposed so far captures this
aspect.

The specific example given doesn't quite fit anywhere on your list. It's
somewhere between #3 and #4; perhaps it's a #4 with a dodgy salesman trying
to push it as a #3 until a better problem is found for it to solve?

I was going to suggest "porpoise oil" (from "not fit-for-purpose"), but how
about "unicorn oil" - something that may well have some uncertain magical
properties, but still sold under false pretenses, and not really going to
cure your ills?

--
Dan.
Re: solving the wrong problem
Peter Gutmann wrote:
> Peter Fairbrother <[EMAIL PROTECTED]> writes:
>> Perry E. Metzger wrote:
>>> Frequently, scientists who know nothing about security come up with
>>> ingenious ways to solve non-existent problems. Take this, for example:
>>>
>>> http://www.sciam.com/article.cfm?chanID=sa003&articleID=00049DB6-ED96-12E7-AD9683414B7F
>>>
>>> Basically, some clever folks have found a way to "fingerprint" the
>>> fiber pattern in a particular piece of paper so that they know they
>>> have a particular piece of paper on hand.
>>
>> Didn't the people who did US/USSR nuclear arms verification do something
>> very similar, except the characterised surface was sparkles in plastic
>> painted on the missile rather than paper?
>
> Yes. The intent was that forging the fingerprint on a warhead should cost as
> much or more than the warhead itself.

Talking of solving the wrong problem, that's a pretty bad metric - forging
should cost the damage an extra warhead would do, rather than the cost of an
extra warhead. That's got to be in the trillions, rather than a few hundred
thousand for another warhead.

--
Peter Fairbrother
locking door when window is open? (Re: solving the wrong problem)
"Single picket fence" -- doesn't work without a lot of explaining. The one I usually have usually heard is the obvious and intuitive "locking the door when the window is open". (ie fixating on quality of dead-bolt, etc on the front door when the window beside it is _open_!) Adam On Sat, Aug 06, 2005 at 04:27:51PM -0400, John Denker wrote: > Perry E. Metzger wrote: > > >We need a term for this sort of thing -- the steel tamper > >resistant lock added to the tissue paper door on the wrong vault > >entirely, at great expense, by a brilliant mind that does not > >understand the underlying threat model at all. > > > >Anyone have a good phrase in mind that has the right sort of flavor > >for describing this sort of thing? > > In a similar context, Whit Diffie once put up a nice > graphic: A cozy little home protected by a picket fence. > he fence consisted of a single picket that was a mile > high ... while the rest of the perimeter went totally > unprotected. > > So, unless/until somebody comes up with a better metaphor, > I'd vote for "one-picket fence". - The Cryptography Mailing List Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]
Re: solving the wrong problem
Adam Shostack <[EMAIL PROTECTED]> writes:
> Let me propose another answer to Perry's question:
>   "Wearing a millstone around your neck to ward off vampires."
>
> This expresses both ends of a lose/lose proposition:
>   -- a burdensome solution
>   -- to a fantastically unimportant problem.

That sounds a bit like "unicorn insurance" ("We've taken out insurance
against unicorns, and we know that it's effective because we haven't been
attacked by any unicorns yet"), which is used for silly threat models.

However, this is slightly different from what Perry was suggesting. There
seem to be at least four subclasses of problem here:

1. "???": A solution based on a misunderstanding of what the real problem is.

2. "Unicorn insurance": A solution to a nonexistent problem.

3. "???": A solution to a problem created artificially in order to justify its
   solution (or at least to justify publication of an academic paper
   containing a solution).

4. "PKI": A solution in search of a problem.

Peter.
Re: solving the wrong problem
Peter Fairbrother <[EMAIL PROTECTED]> writes:
> Perry E. Metzger wrote:
>> Frequently, scientists who know nothing about security come up with
>> ingenious ways to solve non-existent problems. Take this, for example:
>>
>> http://www.sciam.com/article.cfm?chanID=sa003&articleID=00049DB6-ED96-12E7-AD9683414B7F
>>
>> Basically, some clever folks have found a way to "fingerprint" the
>> fiber pattern in a particular piece of paper so that they know they
>> have a particular piece of paper on hand.
>
> Didn't the people who did US/USSR nuclear arms verification do something
> very similar, except the characterised surface was sparkles in plastic
> painted on the missile rather than paper?

Yes. The intent was that forging the fingerprint on a warhead should cost as
much or more than the warhead itself.

Then the Soviet Union collapsed, and the unforgeable fingerprints were
replaced by magic markers, which were cheaper to manage.

Peter.
Re: solving the wrong problem
Adam Shostack wrote:
> Here's a thought:
> "Putting up a beware of dog sign, instead of getting a dog."

That's an interesting topic for discussion, but I don't think it answers
Perry's original question, because there are plenty of situations where the
semblance of protection is actually a cost-effective form of security. It's
an example of statistical deterrence.

Look at it from the attacker's point of view: If a fraction X of the
beware-of-dog signs really are associated with fierce dogs, while (1-X) are
not, *and* the attacker cannot tell which are which, and there are plenty of
softer targets available, the attacker won't risk messing with places that
have signs, because the downside is just too large. The fraction X doesn't
need to be 100%; even a smallish percentage may be a sufficient deterrent.
OTOH, of course, if the sign-trick catches on to the point where everybody
has a sign, the sign loses all value.

We can agree that the dog-sign is not a particularly good application of the
idea of statistical enforcement, because there are too many ways for the
attacker to detect the absence of a real dog.

A better example of statistical deterrence is traffic law enforcement. The
cops don't need to catch every speeder every day; they just need to catch
enough speeders often enough, and impose sufficiently unpleasant penalties.
The enforcement needs to be random enough that would-be violators cannot
reliably identify times and places where there will be no enforcement.
Statistical enforcement (if done right) is *not* the same as "security by
obscurity".

This is relevant to cryptography in the following sense: I doubt
cryptological techniques alone will ever fully solve the phishing problem. A
more well-rounded approach IMHO would include "sting" operations against the
phishers. Even a smallish percentage chance that using phished information
would lead to being arrested would reduce the prevalence of the problem by
orders of magnitude.

=============================

Let me propose another answer to Perry's question:
  "Wearing a millstone around your neck to ward off vampires."

This expresses both ends of a lose/lose proposition:
  -- a burdensome solution
  -- to a fantastically unimportant problem.

This is related to the anklets on the White Knight's horse, "to guard against
the bites of sharks" ... with added emphasis on the burdensomeness of the
solution.
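A toy model of the statistical-deterrence argument above, with made-up
numbers chosen only for illustration: if a fraction X of signs are backed by
real dogs, the attacker's expected payoff is (1-X)*gain - X*loss, which goes
negative -- i.e. the sign deters -- as soon as X exceeds gain/(gain+loss).

```python
def expected_payoff(x: float, gain: float, loss: float) -> float:
    """Attacker's expected gain when a fraction x of signs hide a real dog."""
    return (1.0 - x) * gain - x * loss

gain, loss = 500.0, 20_000.0        # hypothetical burglary gain vs. cost of being caught
threshold = gain / (gain + loss)    # ~0.024: roughly 2.5% real dogs already deter
for x in (0.01, threshold, 0.05, 0.25):
    print(f"X = {x:.3f} -> expected payoff = {expected_payoff(x, gain, loss):+9.1f}")
```

The point of the sketch is only that the break-even fraction can be far below
100% when the penalty dwarfs the gain, which is the core of the argument.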
Re: solving the wrong problem
Perry E. Metzger wrote:
>
> Frequently, scientists who know nothing about security come up with
> ingenious ways to solve non-existent problems. Take this, for example:
>
> http://www.sciam.com/article.cfm?chanID=sa003&articleID=00049DB6-ED96-12E7-AD9683414B7F
>
> Basically, some clever folks have found a way to "fingerprint" the
> fiber pattern in a particular piece of paper so that they know they
> have a particular piece of paper on hand.

Didn't the people who did US/USSR nuclear arms verification do something very
similar, except the characterised surface was sparkles in plastic painted on
the missile rather than paper?

--
Peter Fairbrother
Re: solving the wrong problem
Here's a thought:
"Putting up a beware of dog sign, instead of getting a dog."

On Sun, Aug 07, 2005 at 09:10:51PM +0100, Dave Howe wrote:
| Ilya Levin wrote:
| > John Denker <[EMAIL PROTECTED]> wrote:
| >
| >> So, unless/until somebody comes up with a better metaphor,
| >> I'd vote for "one-picket fence".
| >
| > "Nonsense fence" may be less metaphoric but more clear.
| I disagree - "one picket fence" gives a clear impression of a protective
| device that is hardened at but one point - leaving the rest insecure.
| "Nonsense fence" doesn't give any real image.
Re: solving the wrong problem
Ilya Levin wrote:
> John Denker <[EMAIL PROTECTED]> wrote:
>
>> So, unless/until somebody comes up with a better metaphor,
>> I'd vote for "one-picket fence".
>
> "Nonsense fence" may be less metaphoric but more clear.

I disagree - "one picket fence" gives a clear impression of a protective
device that is hardened at but one point - leaving the rest insecure.
"Nonsense fence" doesn't give any real image.
Re: solving the wrong problem
John Denker <[EMAIL PROTECTED]> wrote:
> So, unless/until somebody comes up with a better metaphor,
> I'd vote for "one-picket fence".

"Nonsense fence" may be less metaphoric but more clear.

--
Ilya O. Levin
http://www.literatecode.com
Re: solving the wrong problem
On Sat, 6 Aug 2005, Perry E. Metzger wrote:
> We already have the term "snake oil" for a very different type of bad
> security idea, and the term has proven valuable for quashing such
> things. We need a term for this sort of thing -- the steel tamper
> resistant lock added to the tissue paper door on the wrong vault
> entirely, at great expense, by a brilliant mind that does not
> understand the underlying threat model at all.
>
> Anyone have a good phrase in mind that has the right sort of flavor
> for describing this sort of thing?

Chief Security Officer comes to mind...

> Perry

--
Yours,

J.A. Terranson
[EMAIL PROTECTED]
0xBD4A95BF

I like the idea of belief in drug-prohibition as a religion in that it is a
strongly held belief based on grossly insufficient evidence and bolstered by
faith born of intuitions flowing from the very beliefs they are intended to
support.

don zweig, M.D.
Re: solving the wrong problem
When I came to Washington, DC last November, my portrait and fingerprints
were taken for the first time. I was the last one in the queue and the
immigration officer was a nice guy, so I asked him how this was supposed to
protect against terrorists. As far as I had read in the newspapers, the 9/11
attackers simply came under their real identities, with their own passports.

He smiled and told me that this is not about terrorism. It is about illegal
immigrants. A complete criminal infrastructure has established itself. As
soon as my passport is stolen, or if I lose it, they will have someone who
looks similar to me try to enter the US with my passport. The problem is that
they do not modify or tamper with the passport in any way. The officers do
not have any chance to detect any flaw with the passport, since it is still
an authentic one.

Their problem is not detecting forged passports; their problem is whether the
passport belongs to the person. That's why they are taking fingerprints and
pictures. Once the owner of a passport has entered the USA and is in the
database, they can detect if someone else is trying to enter with the same
passport.

Detection of the fiber structure wouldn't help here.

regards
Hadmut
Re: solving the wrong problem
Reminds me of the White Knight from Through the Looking-Glass, who doesn't
understand his threat model, and doesn't know how to effectively use his
tools:

  `I see you're admiring my little box,' the Knight said in a friendly tone.
  `It's my own invention -- to keep clothes and sandwiches in. You see I
  carry it upside-down, so that the rain ca'n't get in.'

  `But the things can get out,' Alice gently remarked. `Do you know the lid's
  open?'

  `I didn't know it,' the Knight said, a shade of vexation passing over his
  face. `Then all the things must have fallen out! And the box is no use
  without them.'

  ...

  `You see,' he went on after a pause, `it's as well to be provided for
  every-thing. That's the reason the horse has all those anklets round his
  feet.'

  `But what are they for?' Alice asked in a tone of great curiosity.

  `To guard against the bites of sharks,' the Knight replied. `It's an
  invention of my own.'

Full text from the chapter: http://www.sabian.org/Alice/lgchap08.htm

alien

"Perry E. Metzger" writes:
> We already have the term "snake oil" for a very different type of bad
> security idea, and the term has proven valuable for quashing such
> things. We need a term for this sort of thing -- the steel tamper
> resistant lock added to the tissue paper door on the wrong vault
> entirely, at great expense, by a brilliant mind that does not
> understand the underlying threat model at all.
Re: solving the wrong problem
Perry E. Metzger wrote:
> A variant on the moviefone.com model might work better for these folks
> -- have the person buy the tickets with a credit card, and use a
> machine to check that they are in physical possession of said card
> when they enter the theater. Most people will not loan their cards to
> strangers, so the model works reasonably well without excess amounts
> of supervision by humans.

some of the work 5-6 years ago on self-serve boarding pass machines was along
the same lines ... it would have to be the same card as was used to purchase
the ticket. now it is any card that can be related to you and your name ...
since there is still somebody at the head of the line who checks that the
name on the boarding pass (that was just printed) is the same name as on some
government-issued picture ID.
Re: solving the wrong problem
"Steven M. Bellovin" <[EMAIL PROTECTED]> writes: > Tickets are an excellent use for this, because it binds the printing to > a specific physical object. The concert industry has had a problem > with trying to use print-at-home tickets -- the fraudsters buy a single > ticket, then print it multiple times and sell the resulting tickets to > others. One group is resorting to requiring ID at the door -- buyers > will never have a physical ticket until after they're escorted inside, > to eliminate the opportunity for such fraud. (See > http://www.nytimes.com/2005/08/06/arts/music/06scal.html for more > details.) The threat model is slightly more complex than that. The industry doesn't want people reselling real tickets, either, which is one reason that physical objects aren't enough. In fact, the NY Times article you cite mentions people being physically escorted in to prevent resale as well as duplication. > Yes, you could do everything via an online system based on identity > documents. Apart from the privacy implications, and the problem of > coping with network failures just prior to the start of a concert or > game, dealing with the multiple forms of ID people carry isn't easy; it > requires a fair amount of preparation and infrastructure. As I said, > people may be moving in that direction, but the article itself called > the scheme "laborious"; the band's manager called it "unbelievably > cumbersome". A variant on the moviefone.com model might work better for these folks -- have the person buy the tickets with a credit card, and use a machine to check that they are in physical possession of said card when they enter the theater. Most people will not loan their cards to strangers, so the model works reasonably well without excess amounts of supervision by humans. > I don't disagree with Perry's basic statement -- that a lot of people > try to solve the wrong problem. Here, though, we have a tool. It > remainds to be determined if it's a hammer, screwdriver, or wrench, and > hence what problems to apply it to. Oh, sure, I think it may be a fine tool, but it is a very narrow tool, and possibly a hard one to use. It might make sense for offline authentication of printed bearer financial instruments (like currency) based on a digital signature on the "fingerprint" information and similar stuff. My problem is with the claimed use in identity documents, which seems entirely wrongheaded... Perry - The Cryptography Mailing List Unsubscribe by sending "unsubscribe cryptography" to [EMAIL PROTECTED]
Re: solving the wrong problem
In message <[EMAIL PROTECTED] nk.net>, John Kelsey writes:
>
> On the other hand, think about the uses of this technology
> for paper bearer instruments. Design travelers' checks that
> include a 2D barcode with a BLS signature, bound to the
> piece of paper, and you can print the damned thing on
> regular paper if the readers are cheap enough. Similar
> things apply to stamps, tickets, etc.

Tickets are an excellent use for this, because it binds the printing to a
specific physical object. The concert industry has had a problem with trying
to use print-at-home tickets -- the fraudsters buy a single ticket, then
print it multiple times and sell the resulting tickets to others. One group
is resorting to requiring ID at the door -- buyers will never have a physical
ticket until after they're escorted inside, to eliminate the opportunity for
such fraud. (See http://www.nytimes.com/2005/08/06/arts/music/06scal.html for
more details.)

Yes, you could do everything via an online system based on identity
documents. Apart from the privacy implications, and the problem of coping
with network failures just prior to the start of a concert or game, dealing
with the multiple forms of ID people carry isn't easy; it requires a fair
amount of preparation and infrastructure. As I said, people may be moving in
that direction, but the article itself called the scheme "laborious"; the
band's manager called it "unbelievably cumbersome".

I don't disagree with Perry's basic statement -- that a lot of people try to
solve the wrong problem. Here, though, we have a tool. It remains to be
determined if it's a hammer, screwdriver, or wrench, and hence what problems
to apply it to.

--Steven M. Bellovin, http://www.cs.columbia.edu/~smb
Re: solving the wrong problem
>From: "Perry E. Metzger" <[EMAIL PROTECTED]> >Sent: Aug 6, 2005 2:28 PM >To: cryptography@metzdowd.com >Subject: solving the wrong problem >Frequently, scientists who know nothing about security come >up with ingenious ways to solve non-existent problems. Take >this, for example: >http://www.sciam.com/article.cfm?chanID=sa003&articleID=00049DB6-ED96-12E7-AD9683414B7F >Basically, some clever folks have found a way to "fingerprint" the >fiber pattern in a particular piece of paper so that they know they >have a particular piece of paper on hand. It is claimed that this >could help stop forged passports. >Unfortunately, the invention is wholely useless for the > stated purpose. A couple of these guys gave a talk at NIST recently. The thing is, I can think of a bunch of uses for the thing they're doing. This looks genuinely useful as a tool. Whether they've worked out how to use the tool to best effect is a different question. The passport idea doesn't add much, as you pointed out. The reason is that the thing you care about there is that the information on the passport hasn't been tampered with and originated from the right source. An identical copy of my passport is no worse than the original. On the other hand, think about the uses of this technology for paper bearer instruments. Design travelers' checks that include a 2D barcode with a BLS signature, bound to the piece of paper, and you can print the damned thing on regular paper if the readers are cheap enough. Similar things apply to stamps, tickets, etc. If you can get readers into peoples' homes, you can even allow home printing of tickets, travelers' checks, etc., each bound to a specific piece of paper. Add a reader to your favorite DVD player platform (I think it's the same basic hardware as is used in a DVD player), and you can uniquely sign content on a disc, and use the player's hardware to enforce only playing content when the disc's biometric matches the signed content. You could use the technique to scan small bits of flat surfaces of all your stuff (the basic technique works on paper, plastic, and metal, at least; I'm not sure if it works on wood or glass), record the biometrics and locations of the scans, and provide this to the police when your house gets burgled. There are some wonderful potential uses for this technology in making paper-based voting systems *much* more secure. And on and on. If I were in the business of producing tamper-resistant paper, I'd be scared to death. ... >Anyway, I have a larger point. >I read about such stuff every day -- wacky new ways of >building "tamper proof tokens", "quantum cryptography", and >other mechanisms invented by smart people who don't >understand threat models at all. Yes. As I said, sometimes this stuff looks almost useless (like quantum cryptography), other times it looks like it may provide powerful tools, despite the fact that its designers don't know much about how to use those tools yet. The same is often true in cryptography, where we have some very theoretical work which sometimes ends up having enormous practical consequences. >We already have the term "snake oil" for a very different >type of bad security idea, and the term has proven valuable >for quashing such things. We need a term for this sort of >thing -- the steel tamper resistant lock added to the >tissue paper door on the wrong vault entirely, at great >expense, by a brilliant mind that does not understand the >underlying threat model at all. 
In my consulting days, I used to use the term "padlocking the screen door"
for the related phenomenon of piling security on one part of the system while
ignoring the bigger vulnerabilities. But this is a bit different.

> Perry

--John Kelsey
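A minimal sketch of the bearer-instrument idea John Kelsey outlines above,
under stated assumptions: paper_fingerprint() stands in for the
fiber-scanning hardware, and Ed25519 (from the Python 'cryptography' package)
stands in for the short BLS signature he mentions. The issuer signs the
sheet's fingerprint together with the face value and prints the signature as
the 2D barcode; a cheap offline reader re-scans the sheet and verifies
against the issuer's public key, so a photocopy on different paper fails.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

def paper_fingerprint(sheet_id: str) -> bytes:
    """Placeholder for the physical fiber scan of one specific sheet."""
    return ("fiber-pattern-of-" + sheet_id).encode()

issuer_key = Ed25519PrivateKey.generate()
issuer_pub = issuer_key.public_key()

def issue(sheet_id: str, face_value: str) -> bytes:
    """Returns the signature that would be printed as the 2D barcode."""
    return issuer_key.sign(paper_fingerprint(sheet_id) + face_value.encode())

def verify(sheet_id: str, face_value: str, barcode_sig: bytes) -> bool:
    """Offline check: re-scan the sheet and verify with the issuer's public key."""
    try:
        issuer_pub.verify(barcode_sig,
                          paper_fingerprint(sheet_id) + face_value.encode())
        return True
    except InvalidSignature:
        return False

sig = issue("sheet-001", "USD 100")
print(verify("sheet-001", "USD 100", sig))   # True: right paper, right value
print(verify("sheet-002", "USD 100", sig))   # False: same barcode copied onto other paper
```

The design choice worth noting is that the signature covers the substrate's
fingerprint, not just the printed data, which is exactly what distinguishes
this from ordinary print-at-home tickets that can be duplicated freely.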
Re: solving the wrong problem
Perry E. Metzger wrote:
> We need a term for this sort of thing -- the steel tamper
> resistant lock added to the tissue paper door on the wrong vault
> entirely, at great expense, by a brilliant mind that does not
> understand the underlying threat model at all.
>
> Anyone have a good phrase in mind that has the right sort of flavor
> for describing this sort of thing?

In a similar context, Whit Diffie once put up a nice graphic: a cozy little
home protected by a picket fence. The fence consisted of a single picket that
was a mile high ... while the rest of the perimeter went totally unprotected.

So, unless/until somebody comes up with a better metaphor, I'd vote for
"one-picket fence".

I recognize that this metaphor is not sufficiently pejorative, because a
single picket is at least arguably a step in the right direction, potentially
a small part of a real solution.