Re: [cryptography] How are expired code-signing certs revoked? (nonrepudiation)
On Wed, 7 Dec 2011, Jon Callas wrote:

> Nonrepudiation is a somewhat daft belief. Let me give a
> gedankenexperiment. Suppose Alice phones up Bob and says, "Hey, Bob, I
> just noticed that you have a digital signature from me. Well, ummm, I
> didn't do it. I have no idea how that could have happened, but it
> wasn't me."
>
> Nonrepudiation is the belief that the probability that Alice is
> telling the truth is less than 2^{-128}, assuming a 3072-bit RSA key
> or a 256-bit ECDSA key, either one used with SHA-256. Moreover, if
> that signature was made with a 521-bit ECDSA key and SHA-512, then the
> probability she's telling the truth goes down to 2^{-256}.
>
> I don't know about you, but I think that the chance that Alice was
> hacked is greater than 1 in 2^128. In fact, I'm willing to believe
> that the probability that space aliens intervened, or that Alice has
> an unknown evil twin, or that some mad scientist has invented a
> cloning ray, is greater than one in 2^128. Ironically, as the key size
> goes up, Alice gets even better excuses. If we used a 1024-bit ECDSA
> key and a 1024-bit hash, then new "reasonable" excuses for Alice
> suggest themselves, like that perhaps she *considered* signing but
> didn't in this universe, while in a nearby universe (under the
> many-worlds interpretation of quantum mechanics, which all the cool
> kids believe in this week) she did, and that signature from a nearby
> universe somehow leaked over.

This is silly - it assumes that there are only two interpretations of her statement:

- a true "collision" (something arbitrary computes to her digital signature, which she did not actually invoke), which is indeed as astronomically unlikely as you propose;

- another unlikely event whose probability happens to be higher than the "collision".

But of course there is a much simpler, far more likely explanation, and that is that she is lying.

However ... this did get me to thinking ...
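For scale, Callas's comparison can be put in numbers. The one-in-a-million figure for endpoint compromise below is an assumption invented purely for illustration; only the 2^-128 figure comes from the argument above:

```python
import math

# Claimed odds that a signature verifies without Alice having signed:
# 2^-128 for a 256-bit ECDSA key with SHA-256 (per the argument above).
log10_p_collision = -128 * math.log10(2)   # about -38.5

# Assumed, purely for illustration: a one-in-a-million chance that
# Alice's machine (and her private key with it) was compromised.
log10_p_hacked = -6.0

# "Alice was hacked" beats "true collision" by ~32 orders of magnitude.
print(log10_p_hacked - log10_p_collision)  # about 32.5
```

Any remotely realistic estimate of key compromise, insider mischief, or plain lying dwarfs the collision probability, which is the point: the cryptographic bound says nothing about the channel through which the key is actually used.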
Can't this problem be solved by forcing Alice to tie her signing key to some other function(s) [1] that she would have a vested interest in protecting AND an attacker would have a vested interest in exploiting? I'm thinking along the lines of:

"I know Alice didn't get hacked because I can see that her bank account didn't get emptied, or that her ecommerce site did not disappear."

"I know Alice didn't get hacked because the bitcoin wallet that we protected with her signing key still has X bitcoins in it, where X is the value I perceived our comms/transactions to be worth."

Or whatever.

[1] I have no implementation details for this. Especially the part about how Bob can determine that this tie has been made, that the tie has sufficient value to assure him, etc.

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography
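Since footnote [1] explicitly leaves the implementation open, the following is only a hypothetical sketch of one half of the idea: derive the signing key and the "hostage" wallet key from a single master secret, so anyone capable of forging Alice's signatures could also have emptied the wallet. The HKDF-based construction and all names here are my own assumptions; how Bob verifies that the tie exists and is valuable enough remains exactly the open problem the footnote identifies.

```python
import hashlib
import hmac

def hkdf_derive(master_secret: bytes, label: bytes, length: int = 32) -> bytes:
    """HKDF (RFC 5869, extract-then-expand) with SHA-256.
    A single expand block suffices here because length <= 32."""
    prk = hmac.new(b"\x00" * 32, master_secret, hashlib.sha256).digest()
    return hmac.new(prk, label + b"\x01", hashlib.sha256).digest()[:length]

master = b"example master secret - hypothetical, not a real key"

# Both keys come from the same root secret: an attacker who obtains the
# master secret to forge signatures could equally derive the wallet key.
signing_key = hkdf_derive(master, b"code-signing")
wallet_key = hkdf_derive(master, b"bitcoin-wallet")

print(signing_key != wallet_key)  # True: distinct keys, shared root
```

The derivation is deterministic, so the wallet's continued solvency is (in this sketch) evidence that the shared root, and hence the signing key, has not leaked.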
Re: [cryptography] How are expired code-signing certs revoked?
> Well the people involved are not dumb, right? They know the
> capabilities of malware as well as most anyone.

Quite so, and that was what I found surprising.

> and thereby emulate the success of the operating system-based code
> signing systems.

I'm probably misinterpreting either you or the proposed system. The usual code-signing system allows you to sign code with any cert you want, and it's up to the installer to choose to trust the cert (or something up the chain). The IEEE Taggant System seemed to be implying that you needed to use certs from a specific issuer. That is a point where well-intentioned systems can be too rigid and get ignored. What am I missing?

Mike N

----- Original Message -----
From: Marsh Ray
To: Michael Nelson
Cc: "cryptography@randombit.net"
Sent: Wednesday, December 21, 2011 3:04 PM
Subject: Re: [cryptography] How are expired code-signing certs revoked?

On 12/21/2011 04:24 PM, Michael Nelson wrote:
>
> Somewhat related: The IEEE is asking for proposals to develop and
> operate a CA as a part of their Taggant System. This involves
> signing to validate the usage of packers (compressing executables).
> Packers can make it hard for anti-virus programs to spot malware.
>
> Does this strike you as impractical?

Yes.

> It seems obvious to me that it will be a wasted effort.

Well the people involved are not dumb, right? They know the capabilities of malware as well as most anyone.

Here's an overview of the proposed system:
http://standards.ieee.org/news/2011/icsg_software.html

> "Today malware uses code-obfuscation techniques to disguise the
> malicious actions they take on your machine. As it is a hard problem
> to overcome such protection technologies, computer-security tools
> have started to become suspicious if an application uses code
> protection.
> As a result, even legitimate content-protection
> technology—generally put in place to either control application usage
> or protect intellectual property (IP) from exposure or tampering—can
> lead to false-positive detections. This forces technology vendors,
> such as software publishers, to make a decision between security and
> software accessibility. Joining the IEEE Software Taggant System
> enables SafeNet to provide our customers a way to enjoy the benefits
> of the proven IP protection without the risk of triggering a negative
> response from common malware protection tools."

So my interpretation of what they're essentially saying is this:

There are mostly three categories of software that need to modify executable memory pages:

A. Operating system loaders. EXEs and DLLs are things AV companies already scan. These modules can be code-signed today (and we all know that signed code is safe code).

B. The "legitimate" code obfuscation systems currently used for IP protection and DRM.

C. Malware, which today uses code polymorphism ("unpackers") to evade signature-based detection.

When today's host-based antimalware systems see the code modifications happening, they don't have an easy way to distinguish category B from category C. So these researchers propose to move category B applications into category A (under the threat of a "risk of triggering a negative response from common malware protection tools") and thereby emulate the success of the operating system-based code signing systems.

Here's a classic article on the topic. In this case, the OS executable loader itself is used as the unpacker:
http://uninformed.org/index.cgi?v=6&a=3&p=2

- Marsh
Re: [cryptography] How are expired code-signing certs revoked?
On 12/21/2011 04:24 PM, Michael Nelson wrote:
> Somewhat related: The IEEE is asking for proposals to develop and
> operate a CA as a part of their Taggant System. This involves
> signing to validate the usage of packers (compressing executables).
> Packers can make it hard for anti-virus programs to spot malware.
>
> Does this strike you as impractical?

Yes.

> It seems obvious to me that it will be a wasted effort.

Well the people involved are not dumb, right? They know the capabilities of malware as well as most anyone.

Here's an overview of the proposed system:
http://standards.ieee.org/news/2011/icsg_software.html

"Today malware uses code-obfuscation techniques to disguise the malicious actions they take on your machine. As it is a hard problem to overcome such protection technologies, computer-security tools have started to become suspicious if an application uses code protection. As a result, even legitimate content-protection technology—generally put in place to either control application usage or protect intellectual property (IP) from exposure or tampering—can lead to false-positive detections. This forces technology vendors, such as software publishers, to make a decision between security and software accessibility. Joining the IEEE Software Taggant System enables SafeNet to provide our customers a way to enjoy the benefits of the proven IP protection without the risk of triggering a negative response from common malware protection tools."

So my interpretation of what they're essentially saying is this:

There are mostly three categories of software that need to modify executable memory pages:

A. Operating system loaders. EXEs and DLLs are things AV companies already scan. These modules can be code-signed today (and we all know that signed code is safe code).

B. The "legitimate" code obfuscation systems currently used for IP protection and DRM.

C. Malware, which today uses code polymorphism ("unpackers") to evade signature-based detection.
When today's host-based antimalware systems see the code modifications happening, they don't have an easy way to distinguish category B from category C. So these researchers propose to move category B applications into category A (under the threat of a "risk of triggering a negative response from common malware protection tools") and thereby emulate the success of the operating system-based code signing systems.

Here's a classic article on the topic. In this case, the OS executable loader itself is used as the unpacker:
http://uninformed.org/index.cgi?v=6&a=3&p=2

- Marsh
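One concrete reason scanners "become suspicious if an application uses code protection" is statistical: compressed or encrypted sections approach maximal byte entropy, which is cheap to measure even when the payload itself can't be inspected. A minimal sketch of that heuristic follows; the 7.0 bits/byte threshold is an illustrative assumption, not any particular AV vendor's rule:

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0.0 to 8.0)."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def looks_packed(section: bytes, threshold: float = 7.0) -> bool:
    """Heuristic: compressed/encrypted sections approach 8 bits/byte."""
    return shannon_entropy(section) > threshold

# Ordinary text or code sits well below the threshold...
plain = b"mov eax, ebx ; add eax, 4 ; jmp loop_start\n" * 100
# ...while a packed section is statistically close to random bytes
# (os.urandom stands in for a compressed payload in this sketch).
packed = os.urandom(4096)

print(looks_packed(plain))   # False
print(looks_packed(packed))  # True
```

In practice this would be computed per PE/ELF section; since both DRM wrappers and malware packers trip the same statistical wire, the taggant proposal amounts to whitelisting the former by identity rather than by content.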
Re: [cryptography] How are expired code-signing certs revoked?
> With that said, I propose that "code signing" and then enforcing
> some kind of "use sanctioning" protocol by the operating system
> vendor is an idiotic idea, and fortunately one that has been proven
> as completely impractical

Somewhat related: The IEEE is asking for proposals to develop and operate a CA as a part of their Taggant System. This involves signing to validate the usage of packers (compressing executables). Packers can make it hard for anti-virus programs to spot malware.

http://standards.ieee.org/develop/indconn/icsg/

Does this strike you as impractical? It seems obvious to me that it will be a wasted effort.

Mike N