Re: [cryptography] How are expired code-signing certs revoked?

2011-12-25 Thread Florian Weimer
* Jon Callas:

 Nonrepudiation is a somewhat daft belief. Let me give a
 gedankenexperiment. Suppose Alice phones up Bob and says, Hey, Bob,
 I just noticed that you have a digital nature from me. Well, ummm, I
 didn't do it. I have no idea how that could have happened, but it
 wasn't me. Nonrepudiation is the belief that the probability that
 Alice is telling the truth is less than 2^{-128}, assuming a 3K RSA
 key or 256-bit ECDSA key either with SHA-256. Moreover, if that
 signature was made with an ECDSA-521 bit key and SHA-512, then the
 probability she's telling the truth goes down to 2^{-256}.

Those numbers aren't really important.  In practice, Alice says, my
secretary signed those documents for me, without me actually knowing
their contents.  This has been successfully used to dispute
commitment to content covered by digital signatures, without a
compromise at the cryptographic level (or even hinting to it).

Two factors make this a plausible defence: It is not reasonable expect
that someone legally in charge can personally witness every business
transaction (this is true even for rather small businesses), and
applicable law generally forbids use of group keys or certificates
issued to legal persons.  Authorizing someone else to create
cryptographic signatures on your behalf is the only way out of this
dilemma.
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-25 Thread Jonathan Thornburg
Jon Callas wrote:
 Nonrepudiation is a somewhat daft belief. Let me give a
 gedankenexperiment. Suppose Alice phones up Bob and says, Hey, Bob,
 I just noticed that you have a digital nature from me. Well, ummm, I
 didn't do it. I have no idea how that could have happened, but it
 wasn't me. Nonrepudiation is the belief that the probability that
 Alice is telling the truth is less than 2^{-128}, assuming a 3K RSA
 key or 256-bit ECDSA key either with SHA-256. Moreover, if that
 signature was made with an ECDSA-521 bit key and SHA-512, then the
 probability she's telling the truth goes down to 2^{-256}.

On Sun, 25 Dec 2011, Florian Weimer wrote:
 Those numbers aren't really important.  In practice, Alice says, my
 secretary signed those documents for me, without me actually knowing
 their contents.

There are other alternatives as well:
* Alice says Yes, I clicked the 'sign' button, but the document on my
  screen didn't say 'transfer all my money to Bob', it said 'transfer
  my next month's rent to $landlord'.  Hmm, just as I was clicking the
  'sign' button a bunch of stuff flashed up on the screen for a fraction
  of a second, then went away before I could read it.  That kind of thing
  happens a lot with my computer these days.  It's really irritating,
  isn't it?  But on the positive side, look at these cute dancing bunnies
  I downloaded a few weeks ago.
* Alice says Hey, Bob, I just noticed that you have a digital nature
  from me. Well, ummm, I didn't do it. I have no idea how that could have
  happened, but it wasn't me.  I don't even know what a digital signature
  is, so I'm really really doubtful that I ever did one.  Hey, look at
  these cute dancing bunnies  I downloaded a few weeks ago.

In practice, a digital signature establishes a binding between some 
piece of software which knows Alice's private key, and some bit-string
(a document).  But the legal system wants a binding to Alice's conscious
intent, which is a *very* different thing.

-- 
-- Jonathan Thornburg [remove -animal to reply] 
jth...@astro.indiana-zebra.edu
   Dept of Astronomy  IUCSS, Indiana University, Bloomington, Indiana, USA
   Washing one's hands of the conflict between the powerful and the
powerless means to side with the powerful, not to be neutral.
  -- quote by Freire / poster by Oxfam
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked? (nonrepudiation)

2011-12-22 Thread Adam Back

Stefan Brands credentials [1] have an anti-lending feature where you have to
know all of the private components in order to make a signature with it.

My proposal related to what you said was to put a high value ecash coin as
one of the private components.  Now they have a direct financial incentive -
if they get hacked and their private keys stolen they lose $1m untraceably.

Now thats quite reassuring - and encapsulates a smart contract where they
get an automatic fine, or good behavior bond.  I think you could put a
bitcoin in there instead of a high value Brands based ecash coin.  Then you
could even tell that it wasnt collected by looking in the spend list.

Adam

[1] http://www.cypherspace.org/credlib/ a library implementing Brands
credentials - it has pointers to the uprove spec, Brands thesis in pdf form
etc.

On Thu, Dec 22, 2011 at 07:17:21AM +, John Case wrote:


On Wed, 7 Dec 2011, Jon Callas wrote:

Nonrepudiation is a somewhat daft belief. Let me give a 
gedankenexperiment. Suppose Alice phones up Bob and says, Hey, 
Bob, I just noticed that you have a digital nature from me. Well, 
ummm, I didn't do it. I have no idea how that could have happened, 
but it wasn't me. Nonrepudiation is the belief that the 
probability that Alice is telling the truth is less than 2^{-128}, 
assuming a 3K RSA key or 256-bit ECDSA key either with SHA-256. 
Moreover, if that signature was made with an ECDSA-521 bit key and 
SHA-512, then the probability she's telling the truth goes down to 
2^{-256}.


I don't know about you, but I think that the chance that Alice was 
hacked is greater than 1 in 2^128. In fact, I'm willing to believe 
that the probability that somehow space aliens, or Alice has an 
unknown evil twin, or some mad scientist has invented a cloning ray 
is greater than one in 2^128. Ironically, as the key size goes up, 
then Alice gets even better excuses. If we used a 1k-bit ECDSA key 
and a 1024-bit hash, then new reasonable excuses for Alice suggest 
themselves, like that perhaps she *considered* signing but didn't 
in this universe, but in a nearby universe (under the many-worlds 
interpretation of quantum mechanics, which all the cool kids 
believe in this week) she did, and that signature from a nearby 
universe somehow leaked over.



This is silly - it assumes that there are only two intepretations of 
her statement:


- a true collision (something arbitrary computes to her digital 
signature, which she did not actually invoke) which is indeed as 
astronomically unlikely as you propose.


- another unlikely event whose probability happens to be higher than 
the collision.


But of course there is a much simpler, far more likely explanation, 
and that is that she is lying.


However ... this did get me to thinking ...

Can't this problem be solved by forcing Alice to tie her signing key 
to some other function(s)[1] that she would have a vested interest in 
protecting AND an attacker would have a vested interest in exploiting 
?


I'm thinking along the lines of:

I know Alice didn't get hacked because I see her bank account didn't 
get emptied, or I see that her ecommerce site did not disappear.


I know Alice didn't get hacked because the bitcoin wallet that we 
protected with her signing key still has X bitcoins in it, where X is 
the value I perceived our comms/transactions to be worth.


Or whatever.

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked? (nonrepudiation)

2011-12-22 Thread ianG

On 22/12/11 18:17 PM, John Case wrote:


On Wed, 7 Dec 2011, Jon Callas wrote:


Nonrepudiation is a somewhat daft belief.


+1


Let me give a gedankenexperiment. Suppose Alice phones up Bob and 
says, Hey, Bob, I just noticed that you have a digital nature from 
me. Well, ummm, I didn't do it. I have no idea how that could have 
happened, but it wasn't me. Nonrepudiation is the belief that the 
probability that Alice is telling the truth is less than 2^{-128}, 
assuming a 3K RSA key or 256-bit ECDSA key either with SHA-256. 
Moreover, if that signature was made with an ECDSA-521 bit key and 
SHA-512, then the probability she's telling the truth goes down to 
2^{-256}.


I don't know about you, but I think that the chance that Alice was 
hacked is greater than 1 in 2^128. In fact, I'm willing to believe 
that the probability that somehow space aliens, or Alice has an 
unknown evil twin, or some mad scientist has invented a cloning ray 
is greater than one in 2^128. Ironically, as the key size goes up, 
then Alice gets even better excuses. If we used a 1k-bit ECDSA key 
and a 1024-bit hash, then new reasonable excuses for Alice suggest 
themselves, like that perhaps she *considered* signing but didn't in 
this universe, but in a nearby universe (under the many-worlds 
interpretation of quantum mechanics, which all the cool kids believe 
in this week) she did, and that signature from a nearby universe 
somehow leaked over.



This is silly - it assumes that there are only two intepretations of 
her statement:


- a true collision (something arbitrary computes to her digital 
signature, which she did not actually invoke) which is indeed as 
astronomically unlikely as you propose.


- another unlikely event whose probability happens to be higher than 
the collision.


But of course there is a much simpler, far more likely explanation, 
and that is that she is lying.



Actually there is a much simpler, far more likely explanation:  she's 
telling the truth:


   she has no idea how it happened or what it means.

The problem of digital signing is that most all the crypto world think 
that the challenge is to create a a cryptographically secure copy of a 
signature.  It isn't.


The challenge is to emulate signing, not emulate the signature.  Signing 
is something else.  It is, in short, making a mark to record a moment in 
time (in this case likely agreeing to something) so as to remember that 
moment.


In law, we can remember that moment in time by thrusting the image of 
the signature in front of Alice and saying did you make that mark? or 
in more cautions terms is that your signature?  Now, at this stage, if 
it looks like she did make the mark, *or* it looks like her signature, 
we can now clarify things fairly quickly.  You can invent a decision 
tree here, where the interrogation goes one way or another depending on 
how she responds.


Now try the same thing with a digsig.

Alice, did you run the formula that resulted in this number:
   389274928398238742389472398472983...
over this other number:
   982374982374984759347590348239847...
that stamped over this over DOC file?

The right answer, the *ONLY* answer is:  I have no clue what you just 
said?


So it fails right there.  A digsig is completely and utterly the most 
lousy signature ever invented because it has no capability to record in 
the mind of the utterer the event at the time.  It's disgustingly bad.  
A 4 year old child could do better, and often does, with paper and crayons.


(Then, you can imagine the mad-techo-french-smartcard-scientists saying,

non!  Sacre blue!  Moment, sil vous plait!
We put le key en le secure plastique un we bla de bla...

Well no.  It has no more validity as a signature because it fails to 
record the moment to the mind of Alice.  Sorry.  It ain't signing.)





However ... this did get me to thinking ...

Can't this problem be solved by forcing Alice to tie her signing key 
to some other function(s)[1] that she would have a vested interest in 
protecting AND an attacker would have a vested interest in exploiting ?


I'm thinking along the lines of:

I know Alice didn't get hacked because I see her bank account didn't 
get emptied, or I see that her ecommerce site did not disappear.


I know Alice didn't get hacked because the bitcoin wallet that we 
protected with her signing key still has X bitcoins in it, where X is 
the value I perceived our comms/transactions to be worth.


Or whatever.


[1] I have no implementation details for this.  Especially the part 
about how Bob can determine that this tie has been made, and that the 
tie has sufficient value to assure him, etc.


Yeah, so the protocol known as signing changes depending on the purpose 
and value :)



(Oh, yeah, and that's before we get to non-repudiation which 
clashes with law principles at its most foundational.. and if it 
ever happened would lead to mass rioting and plastique bonfires and 

Re: [cryptography] How are expired code-signing certs revoked?

2011-12-21 Thread Marsh Ray

On 12/21/2011 04:24 PM, Michael Nelson wrote:


Somewhat related: The IEEE is asking for proposals to develop and
operate a CA as a part of their Taggant System.  This involves
signing to validate the usage of packers (compressing executables).
Packers can make it hard for anti-virus programs to spot malware.

Does this strike you as impractical?


Yes.


It seems obvious to me that it will be a wasted effort.


Well the people involve are not dumb, right? They know the capabilities 
of malware as well as most anyone.


Here's an overview of the proposed system:

http://standards.ieee.org/news/2011/icsg_software.html

Today malware uses code-obfuscation techniques to disguise the
malicious actions they take on your machine. As it is a hard problem
to overcome such protection technologies, computer-security tools
have started to become suspicious if an application uses code
protection. As a result, even legitimate content-protection
technology—generally put in place to either control application usage
or protect intellectual property (IP) from exposure or tampering—can
lead to false-positive detections. This forces technology vendors,
such as software publishers, to make a decision between security and
software accessibility. Joining the IEEE Software Taggant System
enables SafeNet to provide our customers a way to enjoy the benefits
of the proven IP protection without the risk of triggering a negative
response from common malware protection tools.


So my interpretation of what they're essentially saying is this:

There are mostly three categories of software that need to modify 
executable memory pages:


A. Operating system loaders. EXEs and DLLs are things AV companies 
already scan. These modules can be code-signed today (and we all know 
that signed code is safe code).


B. The legitimate code obfuscation systems currently for IP protection 
and DRM.


C. Malware, which today uses code polymorphism (unpackers) to evade 
signature-based detection.


When today's host based antimalware systems see the code modifications 
happening, it doesn't have an easy way to distinguish category B from 
category C. So these researchers propose to move category B applications 
into category A (under the threat of risk of triggering a negative 
response from common malware protection tools) and thereby emulate the 
success of the operating system-based code signing systems.


Here's a classic article on the topic. In this case, the OS executable 
loader itself is used as the unpacker:

http://uninformed.org/index.cgi?v=6a=3p=2

- Marsh
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-21 Thread Michael Nelson
 Well the people involve are not dumb,
 right? They know
the capabilities of malware
 as well as most anyone.

Quite so, and that was what I found surprising.

 and thereby emulate the success of the
 operating system-based code signing systems.


I'm probably misinterpreting either you or the proposed system.  The usual 
code-signing system allows you to sign code with any cert you want, and it's up 
to the installer to choose to trust the cert (or something up the chain).  The 
ieee Taggant system seemed to be implying that you needed to use certs from a 
specific issuer.  That is a point where well-intentioned systems are can be too 
rigid and get ignored.  What am I missing?

Mike N


- Original Message -
From: Marsh Ray ma...@extendedsubset.com
To: Michael Nelson nelson_mi...@yahoo.com
Cc: cryptography@randombit.net cryptography@randombit.net
Sent: Wednesday, December 21, 2011 3:04 PM
Subject: Re: [cryptography] How are expired code-signing certs revoked?

On 12/21/2011 04:24 PM, Michael Nelson wrote:
 
 Somewhat related: The IEEE is asking for proposals to develop and
 operate a CA as a part of their Taggant System.  This involves
 signing to validate the usage of packers (compressing executables).
 Packers can make it hard for anti-virus programs to spot malware.
 
 Does this strike you as impractical?

Yes.

 It seems obvious to me that it will be a wasted effort.

Well the people involve are not dumb, right? They know the capabilities of 
malware as well as most anyone.

Here's an overview of the proposed system:

http://standards.ieee.org/news/2011/icsg_software.html
 Today malware uses code-obfuscation techniques to disguise the
 malicious actions they take on your machine. As it is a hard problem
 to overcome such protection technologies, computer-security tools
 have started to become suspicious if an application uses code
 protection. As a result, even legitimate content-protection
 technology—generally put in place to either control application usage
 or protect intellectual property (IP) from exposure or tampering—can
 lead to false-positive detections. This forces technology vendors,
 such as software publishers, to make a decision between security and
 software accessibility. Joining the IEEE Software Taggant System
 enables SafeNet to provide our customers a way to enjoy the benefits
 of the proven IP protection without the risk of triggering a negative
 response from common malware protection tools.

So my interpretation of what they're essentially saying is this:

There are mostly three categories of software that need to modify executable 
memory pages:

A. Operating system loaders. EXEs and DLLs are things AV companies already 
scan. These modules can be code-signed today (and we all know that signed code 
is safe code).

B. The legitimate code obfuscation systems currently for IP protection and 
DRM.

C. Malware, which today uses code polymorphism (unpackers) to evade 
signature-based detection.

When today's host based antimalware systems see the code modifications 
happening, it doesn't have an easy way to distinguish category B from category 
C. So these researchers propose to move category B applications into category A 
(under the threat of risk of triggering a negative response from common 
malware protection tools) and thereby emulate the success of the operating 
system-based code signing systems.

Here's a classic article on the topic. In this case, the OS executable loader 
itself is used as the unpacker:
http://uninformed.org/index.cgi?v=6a=3p=2

- Marsh

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-18 Thread M.R.

On 2011-12-07 16:31, Jon Callas wrote:

There are many things about code signing that I don't think I understand.


same here.

But I do understand something about the code creation, dissemination
and the trust between code creator and code user (primary parties),
and the role of the operating system vendor (a tertiary party) as
an intermediary between the code creator and the code user.

With that said, I propose that code signing and then enforcing some
kind of use sanctioning protocol by the operating system vendor is
an idiotic idea, and fortunately one that has been proven as completely
impractical and ill-aligned with the interest of the two primary 
parties, and thus continually rejected in practice.


What should be signed and tusted (or not trusted) is not the code,
but the channel by which the code is distributed.

Mark R.

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-18 Thread Jon Callas

On Dec 18, 2011, at 10:19 AM, M.R. wrote:

 On 2011-12-07 16:31, Jon Callas wrote:
 There are many things about code signing that I don't think I understand.
 
 same here.
 
 But I do understand something about the code creation, dissemination
 and the trust between code creator and code user (primary parties),
 and the role of the operating system vendor (a tertiary party) as
 an intermediary between the code creator and the code user.
 
 With that said, I propose that code signing and then enforcing some
 kind of use sanctioning protocol by the operating system vendor is
 an idiotic idea, and fortunately one that has been proven as completely
 impractical and ill-aligned with the interest of the two primary parties, and 
 thus continually rejected in practice.
 
 What should be signed and tusted (or not trusted) is not the code,
 but the channel by which the code is distributed.

Which is precisely what can't be done, in the general case.

It's really, really, doable in the singular case. If the channel signs the code 
(which is what Apple does on the App Store), then sure, Alice is your auntie. 

But when developer D has code they sign *themselves* with a cert given from 
signatory S, and delivered to marketplace M, you end up with some sort of 
DSM-defined insanity. There's no responsibility anywhere. The worst, though, is 
to go to the signer and say, This is another fine mess you've gotten me into, 
Stanley.

Jon
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-11 Thread ianG

On 8/12/11 02:11 AM, d...@geer.org wrote:

Another wrinkle, at least as a logic problem, would be
whether you can revoke the signing cert for a CRL and
what, exactly, would that mean -- particularly if the
last known good date is well astern and hence the
revocation would optimally be retroactive.


Is the logical answer here that you have to treat the signing cert for a 
CRL at the same level as the root concerned?


So a CRL-signing cert for a sub-root (generally one and the same thing) 
would (both) want to be revoked at the root level, that is, appear in 
the CRL as signed by the root.  Whether it works that way in practice, I 
don't know.  I suppose I should...


In PKI it's a fairly well established principle that the layer one up 
has to revoke [0].  So, when some roots needed to be revoked recently, 
browsers had to ship new software.  Vendors are the ueber-CA.  
Therefore, the CRL/OCSP certs for a root can only be revoked at software 
level.



--dan, quite possibly in a rat hole


iang, we're all in rat holes together



[0] Unlike PGP where self can revoke self;  there are no layers.
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-11 Thread Jon Callas
On 10 Dec, 2011, at 11:58 PM, Peter Gutmann wrote:

 Jon Callas j...@callas.org writes:
 
 If someone actually built such combination of OS and marketplace, it would
 work for the users very well, but developers would squawk about it. Properly
 done, it could drop malware rates to close to nil.
 
 Oh, developers would do more than squawk about it.  Both Java and .NET
 actually support the capability-based security that you mentioned, but it's so
 painful to use that it's either turned off by default (.NET's 'trust
 level=Full') or was turned off after massive developer backlash (Java).
 Even the very minimal capabilities used by Android are failing because of the
 dancing bunnies and confused deputy problems, and because developers request
 as close to any/any as they can get just in case (exacerbating the confused
 deputy problem).
 
 (One of the nice things about Android is that it's fairly easy to decompile
 and analyse the code, so there have been all sorts of papers published on its
 capability-based security mechanisms using this technique.  It's serving as a
 nice real-world empirical evaluation of failure modes of capability-based
 security systems.  I'm sure someone could get a good thesis out of it at some
 point).
 
 Properly done, it could drop malware rates to close to nil.
 
 Objection, tautology: Properly done, any (malware-related) security measure
 would drop malware rates close to nil.  The problem is doing it properly...
 

Yes, doing it properly is the key and I'll assert that Apple is doing a pretty 
good approximation of it. They are doing more or less what I described -- good 
coding enforcement backed up with digital signatures. There are plenty of 
people squawking about it. I know developers who've thrown up their hands and 
there is plenty of grumpiness I've heard. Some of it reasonable grumpiness, too.

But the end result for the users is that malware rate is close to zero. The 
system is by no means perfect, and has side-effects. But the times when 
something slipped through the net are so few that they're notable still. (And 
some of the malware has been kinda charming, like the flashlight app that had a 
hidden SOCKS proxy that let people use it for tethering.) More importantly, the 
system does not throw things at the users that they're incapable of handling, 
like the Android way of just informing you what capabilities an app needs. 
People can and do just hand devices to their kids and let them use them with no 
ill effects.

Jon


___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-10 Thread Peter Gutmann
Nico Williams n...@cryptonector.com writes:
On Fri, Dec 9, 2011 at 4:41 PM, Jeffrey Walton noloa...@gmail.com wrote:

 Android also make the application a security principal for resource
 sharing (its a smarter walled garden approach). Its an awesome
 approach, especially when compared to Windows and *nix where sharing
 is generally based upon a login context and enforced through DACLs.

That's what I meant by isolation :)

... dancing bunnies ... confused deputy ...

(This is a serious problem on Android phones.  The permissions systems is much
nicer than NT/Unix - mostly because it'd be hard to come up with something
that's worse - but it's fatally vulnerable to the dancing bunnies and confused
deputy problems.  For example one recent analysis of Android phones from a
range of vendors found that, out-of-the-box, before any user apps were even
installed, all of them leaked critical capabilities, all the way up to
MASTER_CLEAR for the phone).

Peter.
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-10 Thread Jon Callas

On 9 Dec, 2011, at 9:15 PM, Peter Gutmann wrote:

 Jon Callas j...@callas.org writes:
 
 If it were hard to get signing certs, then we as a community of developers
 would demonize the practice as having to get a license to code.
 
 WHQL is a good analogy for the situations with certificates, it has to be made
 inclusive enough that people aren't unfairly excluded, but exclusive enough
 that it provides a guarantee of quality.  Pick any one of those two.
 
 (I have a much longer analysis of this, a bit too much to post here, but
 there's a long history of vendors gaming WHQL and the certifiers looking the
 other way, just as there is with browser vendors looking the other way when a
 CA screws up, although in the case of hardware vendors the action is
 deliberate rather than accidental).

Sure, and that's why the assurance system and the signatures have to be tied 
together and the incentives have to be aligned. In a software market where the 
app store itself is doing the validation, doing the enforcement, signing the 
code, and taking the responsibility for both delivering the software and 
backfilling the inevitable errors, you'll see the *system* lower malware. But 
even in that, it's the system that's doing it, not digital signatures. The 
signatures are merely the wax seals. The quality system has to be built to 
create and deliver quality. That is the sine qua non of this whole thing.

I think we agree that trying to build quality by giving certificates to 
developers is a fantasy at best.

Jon

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-10 Thread Peter Gutmann
Jon Callas j...@callas.org writes:

If someone actually built such combination of OS and marketplace, it would
work for the users very well, but developers would squawk about it. Properly
done, it could drop malware rates to close to nil.

Oh, developers would do more than squawk about it.  Both Java and .NET
actually support the capability-based security that you mentioned, but it's so
painful to use that it's either turned off by default (.NET's 'trust
level=Full') or was turned off after massive developer backlash (Java).
Even the very minimal capabilities used by Android are failing because of the
dancing bunnies and confused deputy problems, and because developers request
as close to any/any as they can get just in case (exacerbating the confused
deputy problem).

(One of the nice things about Android is that it's fairly easy to decompile
and analyse the code, so there have been all sorts of papers published on its
capability-based security mechanisms using this technique.  It's serving as a
nice real-world empirical evaluation of failure modes of capability-based
security systems.  I'm sure someone could get a good thesis out of it at some
point).

Properly done, it could drop malware rates to close to nil.

Objection, tautology: Properly done, any (malware-related) security measure
would drop malware rates close to nil.  The problem is doing it properly...

Peter.
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-09 Thread Jon Callas

On 8 Dec, 2011, at 8:27 PM, Peter Gutmann wrote:

 In any case getting signing certs really isn't hard at all.  I once managed 
 it 
 in under a minute (knowing which Google search term to enter to find caches 
 of 
 Zeus stolen keys helps :-).  That's as an outsider, if you're working inside 
 the malware ecosystem you'd probably get them in bulk from whoever's dealing 
 in them (single botnets have been reported with thousands of stolen keys and 
 certs in their data stores, so it's not like the bad guys are going to run 
 out 
 of them in a hurry).
 
 Unlike credit cards and bank accounts and whatnot we don't have price figures 
 for stolen certs, but I suspect it's not that much.

If it were hard to get signing certs, then we as a community of developers 
would demonize the practice as having to get a license to code.

Jon

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-09 Thread Thor Lancelot Simon
On Fri, Dec 09, 2011 at 01:01:05PM -0800, Jon Callas wrote:
 
 
 If you have a certificate issue a revocation for itself, there is an obvious, 
 correct interpretation. That interpretation is what Michael Heyman said, and 
 what OpenPGP does. That certificate is revoked and any subordinate 
 certificates are also implicitly revoked. It's also like making a CRL for 
 everything you issued.

Indeed.  Non-temporal logic is a very poor substitute for temporal logic
in any real-world situation.  But some simple definitions should make the
matter clear in any event:

Q: When is a certificate valid?
A: Until it is revoked, and if some other conditions are met.

Q: When is a certificate revoked?
A: At any time AFTER an authorized party revokes the certificate.

Q: Who is an authorized party for the purpose of revoking a certificate?
A: The signer of the certificate*

* one can envision systems in which the rule is ...or the party identified
  by the certificate, too, but when talking about PKI, generally, that is
  not the rule that is used.  Fortunately self-signed certs let us reason
  about this issue in a vacuum.

Now the problem degenerates to the basic quarrel over retroactive
revocations.  But, depending what your norms are there, with appropriate
choice of a temporal frame of reference it's no harder to solve.

Thor
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-09 Thread Steven Bellovin

On Dec 9, 2011, at 3:46 18PM, Jon Callas wrote:

 
 On 8 Dec, 2011, at 8:27 PM, Peter Gutmann wrote:
 
 In any case getting signing certs really isn't hard at all.  I once managed 
 it 
 in under a minute (knowing which Google search term to enter to find caches 
 of 
 Zeus stolen keys helps :-).  That's as an outsider, if you're working inside 
 the malware ecosystem you'd probably get them in bulk from whoever's dealing 
 in them (single botnets have been reported with thousands of stolen keys and 
 certs in their data stores, so it's not like the bad guys are going to run 
 out 
 of them in a hurry).
 
 Unlike credit cards and bank accounts and whatnot we don't have price 
 figures 
 for stolen certs, but I suspect it's not that much.
 
 If it were hard to get signing certs, then we as a community of developers 
 would demonize the practice as having to get a license to code.
 
Peter is talking about stolen certs, which for most parts of the development
community aren't a prerequisite...  But there's an interesting dilemma here
if we insist on all code being signed.

Assume that a code-signing cert costs {$,€,£,zorkmid}1/year.  Everyone but
large companies would scream.  Now assume the cost is {$,€,£,zorkmid}.01/year
or even free.  At that price, it's a nuisance factor, and would be issued via
a simple web interface.  Simple web interfaces are scriptable (and we all know
the limits of captchas), which means that malware could include a get a cert
routine for the next, mutated generation of itself.  In fact, they're largely
price-insensitive, since they'd be programmed with a stash of stolen credit
cards


--Steve Bellovin, https://www.cs.columbia.edu/~smb





___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-09 Thread Nico Williams
On Fri, Dec 9, 2011 at 4:08 PM, Steven Bellovin s...@cs.columbia.edu wrote:
 On Dec 9, 2011, at 3:46 18PM, Jon Callas wrote:
 If it were hard to get signing certs, then we as a community of developers 
 would demonize the practice as having to get a license to code.

 Peter is talking about stolen certs, which for most parts of the development
 community aren't a prerequisite...  But there's an interesting dilemma here
 if we insist on all code being signed.

 Assume that a code-signing cert costs {$,€,£,zorkmid}1/year.  Everyone but
 large companies would scream.  Now assume the cost is {$,€,£,zorkmid}.01/year
 or even free.  At that price, it's a nuisance factor, and would be issued via
 a simple web interface.  Simple web interfaces are scriptable (and we all know
 the limits of captchas), which means that malware could include a get a cert
 routine for the next, mutated generation of itself.  In fact, they're largely
 price-insensitive, since they'd be programmed with a stash of stolen credit
 cards

This strengthens the argument for digital signatures as a means of
providing upgrade continuity and related application grouping /
isolation, as in the Android model.  No need for a PKI then, no need
to pay for certificates.

In the Android model it shouldn't matter that malware might be signed.
 What should matter is that malware should not be able to gain control
of the device or other user/app data on that device, and, perhaps,
that the user not even get a chance to install said malware, not
because the malware's signatures don't chain up to a trusted CA but
because the app store doesn't publish it and the user uses only
trusted app stores.  Neither of the last two is easy to ensure though
:(

Nico
--
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-09 Thread Randall Webmail
From: Nico Williams n...@cryptonector.com

 What should matter is that malware should not be able to gain control
of the device or other user/app data on that device, and, perhaps,
that the user not even get a chance to install said malware, not
because the malware's signatures don't chain up to a trusted CA but
because the app store doesn't publish it and the user uses only
trusted app stores.  Neither of the last two is easy to ensure though

And yet we see things like someone (apparently) sneakernetting a thumb-drive 
from an infected Internet Cafe to the SIPR network: 
http://www.washingtonpost.com/national/national-security/cyber-intruder-sparks-response-debate/2011/12/06/gIQAxLuFgO_story.html

If the USG can't even keep thumb drives off of SIPR, isn't the whole game 
doomed to failure?   (What genius thought it would be a good idea to put USB 
ports on SIPR-connected boxes, anyway?)
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-09 Thread Jeffrey Walton
On Fri, Dec 9, 2011 at 5:28 PM, Nico Williams n...@cryptonector.com wrote:
 On Fri, Dec 9, 2011 at 4:08 PM, Steven Bellovin s...@cs.columbia.edu wrote:
 On Dec 9, 2011, at 3:46 18PM, Jon Callas wrote:
 If it were hard to get signing certs, then we as a community of developers 
 would demonize the practice as having to get a license to code.

 Peter is talking about stolen certs, which for most parts of the development
 community aren't a prerequisite...  But there's an interesting dilemma here
 if we insist on all code being signed.

 Assume that a code-signing cert costs {$,€,£,zorkmid}1/year.  Everyone 
 but
 large companies would scream.  Now assume the cost is {$,€,£,zorkmid}.01/year
 or even free.  At that price, it's a nuisance factor, and would be issued via
 a simple web interface.  Simple web interfaces are scriptable (and we all 
 know
 the limits of captchas), which means that malware could include a get a 
 cert
 routine for the next, mutated generation of itself.  In fact, they're largely
 price-insensitive, since they'd be programmed with a stash of stolen credit
 cards

 This strengthens the argument for digital signatures as a means of
 providing upgrade continuity and related application grouping /
 isolation, as in the Android model.  No need for a PKI then, no need
 to pay for certificates.
Android also make the application a security principal for resource
sharing (its a smarter walled garden approach). Its an awesome
approach, especially when compared to Windows and *nix where sharing
is generally based upon a login context and enforced through DACLs.

 In the Android model it shouldn't matter that malware might be signed.
  What should matter is that malware should not be able to gain control
 of the device or other user/app data on that device,
Right.

 that the user not even get a chance to install said malware, not
 because the malware's signatures don't chain up to a trusted CA but
 because the app store doesn't publish it and the user uses only
 trusted app stores.  Neither of the last two is easy to ensure though
It never hurts to wish.

Jeff
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-09 Thread Steven Bellovin

On Dec 9, 2011, at 5:41 04PM, Randall Webmail wrote:

 From: Nico Williams n...@cryptonector.com
 
 What should matter is that malware should not be able to gain control
 of the device or other user/app data on that device, and, perhaps,
 that the user not even get a chance to install said malware, not
 because the malware's signatures don't chain up to a trusted CA but
 because the app store doesn't publish it and the user uses only
 trusted app stores.  Neither of the last two is easy to ensure though
 
 And yet we see things like someone (apparently) sneakernetting a thumb-drive 
 from an infected Internet Cafe to the SIPR network: 
 http://www.washingtonpost.com/national/national-security/cyber-intruder-sparks-response-debate/2011/12/06/gIQAxLuFgO_story.html
 
 If the USG can't even keep thumb drives off of SIPR, isn't the whole game 
 doomed to failure?   (What genius thought it would be a good idea to put USB 
 ports on SIPR-connected boxes, anyway?)

How do you import new intelligence data to it?  New software?  Updates?
New anti-virus definitions?  Patches for security holes?  Your external
backup drive?  Your wireless mouse for classified Powerpoint
presentations (based on
http://www.nytimes.com/2010/04/27/world/27powerpoint.html I suspect that
such things indeed happen) I've heard tell of superglue in the USB
ports and I've seen commercial software that tries to limit which
specific storage devices can be connected to (of course) Windows boxes.

Yes, one can imagine technical solutions to all of these, like NSA-run
central software servers and restricted machines to which new data can
be introduced and a good registry of allowed disks and banning both
Powerpoint and the mindset that overuses it.  Is that operationally
realistic, especially in a war environment where you don't have adequate
bandwidth back to Ft.  Meade?  (Hunt up the articles on the moaning and
groaning when DoD banned flash drives.)

The purpose of a system is not to be secure.  Rather, it's to help you
accomplish something else.  Being secure is one aspect of helping to
accomplish something, but it's not the only one.  The trick isn't to be
secure, it's to be secure enough while still getting the job done.
Sometimes, relying on training rather than technology is the right
answer.  Obviously, per that article, it wasn't enough, but it doesn't
mean the approach was wrong; perhaps other approaches would have had
even worse failures.



--Steve Bellovin, https://www.cs.columbia.edu/~smb





___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-09 Thread Nico Williams
On Fri, Dec 9, 2011 at 4:41 PM, Jeffrey Walton noloa...@gmail.com wrote:
 This strengthens the argument for digital signatures as a means of
 providing upgrade continuity and related application grouping /
 isolation, as in the Android model.  No need for a PKI then, no need
 to pay for certificates.
 Android also make the application a security principal for resource
 sharing (its a smarter walled garden approach). Its an awesome
 approach, especially when compared to Windows and *nix where sharing
 is generally based upon a login context and enforced through DACLs.

That's what I meant by isolation :)

 It never hurts to wish.

:(
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-09 Thread Jeffrey Walton
On Fri, Dec 9, 2011 at 6:00 PM, Nico Williams n...@cryptonector.com wrote:
 On Fri, Dec 9, 2011 at 4:41 PM, Jeffrey Walton noloa...@gmail.com wrote:
 This strengthens the argument for digital signatures as a means of
 providing upgrade continuity and related application grouping /
 isolation, as in the Android model.  No need for a PKI then, no need
 to pay for certificates.
 Android also make the application a security principal for resource
 sharing (its a smarter walled garden approach). Its an awesome
 approach, especially when compared to Windows and *nix where sharing
 is generally based upon a login context and enforced through DACLs.

 That's what I meant by isolation :)
Gotcha - my bad.
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-09 Thread dan

  
  If the USG can't even keep thumb drives off of SIPR, isn't the
  whole game doomed to failure?   (What genius thought it would be
  a good idea to put USB ports on SIPR-connected boxes, anyway?)
 

USG is, like all enterprises, struggling with consumerization
such as whether cloud services (like Gmail) have so great a cost
efficiency as to be irresistible or choosing whether individual USG
employees should be forbidden or required to own carry cell phones
(trackability).  There is current budget-driven consideration as to
whether to collapse JWICS and SIPRnet into one fabric.  As the
Internet-of-things rolls forward, the effort required to maintain
even a constant level of risk will be all but entirely infeasible,
and university-level social science research appears poised to make
it possible to imagine a government herded by continuous plebiscite
(such as by automatic, language-independent categorization of all
Tweets in real time), thus ending long-term thinking altogether.

A republic, if you can keep it.

--dan

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-09 Thread Peter Gutmann
Jon Callas j...@callas.org writes:

If it were hard to get signing certs, then we as a community of developers
would demonize the practice as having to get a license to code.

WHQL is a good analogy for the situations with certificates, it has to be made
inclusive enough that people aren't unfairly excluded, but exclusive enough
that it provides a guarantee of quality.  Pick any one of those two.

(I have a much longer analysis of this, a bit too much to post here, but
there's a long history of vendors gaming WHQL and the certifiers looking the
other way, just as there is with browser vendors looking the other way when a
CA screws up, although in the case of hardware vendors the action is
deliberate rather than accidental).

Peter.
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-08 Thread Darren J Moffat

On 12/07/11 14:42, William Whyte wrote:

Well, I think the theoretically correct answer is that you *should*...
these days all the installers can be available online, after all.


Except when the installer CD you need is the one for the network driver 
on the new machine without which you can't get online !


I've been in this situation several times before.

--
Darren J Moffat
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-08 Thread Marsh Ray

On 12/08/2011 09:16 AM, Darren J Moffat wrote:

On 12/07/11 14:42, William Whyte wrote:

Well, I think the theoretically correct answer is that you *should*...
these days all the installers can be available online, after all.


Except when the installer CD you need is the one for the network driver
on the new machine without which you can't get online !


There are systems that aren't online, and there are systems that 
shouldn't be online for good reasons. For example the power grid.


If we consistently neglect this scenario, then if the Internet ever 
suffers more than a brief outage we could find ourselves rebuilding 
society from the iron age.


- Marsh
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-08 Thread Jeffrey Walton
2011/12/7 Marsh Ray ma...@extendedsubset.com:

 On 12/07/2011 07:01 PM, lodewijk andré de la porte wrote:

 I figured it'd be effective to create a security awareness group
 figuring the most prominent (and only effective) way to show people
 security is a priority is by placing a simple marking, something like
  this site isn't safe!


 I thought the international symbol for that was already agreed upon:
 goatse.cx



 On 12/07/2011 07:13 PM, lodewijk andré de la porte wrote:

 I'm afraid signing software is multiple levels of bullocks. Imagine a
  user just clicking yes when something states Unsigned software, do
 you really want to install?.


 You're just thinking of a few code signing schemes that you have direct
 experience with.

 Apple's iPhone app store code signing is far more effective for example.
https://krebsonsecurity.com/2011/11/apple-took-3-years-to-fix-finfisher-trojan-hole/
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-08 Thread mhey...@gmail.com
On Wed, Dec 7, 2011 at 4:32 PM, Peter Gutmann pgut...@cs.auckland.ac.nz wrote:

  In the presence of such a [self-revoking] revocation [of a root certificate]
 applications can react in one of three ways: they can accept the CRL
 that revokes the certificate as valid and revoke it, they can reject the
 CRL as invalid because it was signed by a revoked certificate, or
  they can crash...

Um, the real problem is not revoking the root certificate but what
other certificates to temporarily trust in the face of the revoked
root certificate (In the past, I have chosen the simplest to code
option of none but with the knowledge that things might break).

In a CRL that contains an element that revokes the CRL signing
certificate, only that element can be assumed to be correct. All other
list elements are suspect.

If a self-signed CA certificate lands in that CA's CRL, then, of
course the self-signed certificate can now be considered compromised.
Either the original private key signed the CRL or the compromising
copy signed it - both cases mean the root private key must be
considered compromised. Of course, the second case means some
malicious entity wanted to crash some piece of code that crashes in
this case. I can't think of another reason the malicious entity would
out themselves other than crashing buggy code.

All other elements in that CRL, and, indeed, all CRLs dating back to
the time of the compromise, might be invalid CRL elements. Code I have
written in the past assumed those certificates were invalid even
though they might not be. This was with full knowledge of the possible
but unlikely denial-of-service attack (there are so many better things
one can do with a compromised CA key then issue bad CRLs). Any
CRL-based DoS attack doesn't need to last too long because the CA can
issue new certificates signed with a new key in short order - getting
the new certificates including the new root certificate distributed,
of course, can take more time.

-Michael Heyman
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-08 Thread dan

Peter Gutmann writes:
-+---
 | This means that once a particular signed binary has been detected
 | as being malware the virus scanner can extract the signing
 | certificate and know that anything else that contains that
 | particular certificate will also be malware, with the certificate
 | providing a convenient fixed signature string for virus scanners
 | to look for.
 |

One would assume that the effort to get such a signing
certificate would persuade the bad team to use that cert
for targeted attacks, not broadcast ones, in which case
you would be damned lucky to find it in a place where you
could then encapsulate it in a signature-based protection
scheme.

--dan

good reading:
Cormac Herley,
The Plight of the Targeted Attacker in a World of Scale
http://research.microsoft.com/pubs/132068/TargetedAttacker.pdf

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-08 Thread Peter Gutmann
d...@geer.org writes:

One would assume that the effort to get such a signing certificate would 
persuade the bad team to use that cert for targeted attacks, not broadcast 
ones, in which case you would be damned lucky to find it in a place where you
could then encapsulate it in a signature-based protection scheme.

My post was based on data gathered by a well-known anti-malware company, I'm 
just reporting what they found in real-world use.

In any case getting signing certs really isn't hard at all.  I once managed it 
in under a minute (knowing which Google search term to enter to find caches of 
Zeus stolen keys helps :-).  That's as an outsider, if you're working inside 
the malware ecosystem you'd probably get them in bulk from whoever's dealing 
in them (single botnets have been reported with thousands of stolen keys and 
certs in their data stores, so it's not like the bad guys are going to run out 
of them in a hurry).

Unlike credit cards and bank accounts and whatnot we don't have price figures 
for stolen certs, but I suspect it's not that much.

Peter.
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread William Whyte
Cute scenario!

I would say that you shouldn't *install* signed software after the signing
cert expires, but if you installed it before expiry it's still safe to use
it.

In general, you shouldn't act based on a certificate if you don't know
it's trustworthy (obviously), but the action in question here is
installing the software, not running it.

Cheers,

William

-Original Message-
From: cryptography-boun...@randombit.net
[mailto:cryptography-boun...@randombit.net] On Behalf Of Peter Gutmann
Sent: Wednesday, December 07, 2011 7:02 AM
To: cryptography@randombit.net
Subject: [cryptography] How are expired code-signing certs revoked?

Consider the following scenario:

1. Attackers steal a code-signing key and use it to sign malware.
2. The certificate for the stolen key expires.
3. Malware signed with the key turns up.

Since the signature is timestamped to allow it to still validate after the
original cert expires, it'll be regarded as valid.  Since the cert has now
expired, it won't be present in the CRL, or if it was present it'll be
removed (this is standard practice to manage CRL sizes).

How do you invalidate such a signature?

Peter.
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread William Whyte
Well, I think the theoretically correct answer is that you *should*...
these days all the installers can be available online, after all.

William

-Original Message-
From: Peter Gutmann [mailto:pgut...@cs.auckland.ac.nz]
Sent: Wednesday, December 07, 2011 9:21 AM
To: cryptography@randombit.net; pgut...@cs.auckland.ac.nz;
wwh...@securityinnovation.com
Subject: RE: [cryptography] How are expired code-signing certs revoked?

William Whyte wwh...@securityinnovation.com writes:

I would say that you shouldn't *install* signed software after the
signing cert expires, but if you installed it before expiry it's still
safe to use it.

That wouldn't work, consider the untold numbers of install CDs shipped
with anything that you could think of conneting to a PC at some point
(your shiny new digital camera, your electric toothbrush, ...).  These are
often extremely out-of-date, but you can't block the install just because
the cert has expired.

Peter.
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread dan

Another wrinkle, at least as a logic problem, would be
whether you can revoke the signing cert for a CRL and
what, exactly, would that mean -- particularly if the
last known good date is well astern and hence the
revocation would optimally be retroactive.

--dan, quite possibly in a rat hole

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread Jon Callas
There are many things about code signing that I don't think I understand.

I think that code-signing is a good thing, and that all things being equal, 
code-signing is a good thing, and that code should be signed.

However, there seems to strange, mystical beliefs about it.

As an example, there's the notion that if you have signed code and you revoke 
the signing key (whatever revoke means, and whatever a key is) then the 
software will automagically stop working, as if there's some sort of quantum 
entanglement between the bits of the code and the bits of the key, and 
invalidating the key therefore invalidates the code.

This seems to me to be daft -- I don't see how this *could* work in a general 
case against an attacker who doesn't want that code to stop working (and that 
attacker could be either a malware writer or the owner of the computer). I can 
see plenty of special cases where it works, but it is fundamentally not 
reliable and a security system that wants to stop malware or whatever by 
revoking keys is even less reliable because we now have three or four parties 
(malware writer, machine owner, certifier, anti-virus maker).

It also seems to me that discussions on this list hit this situation from two 
strange directions. One is the general sneering at the daft belief. The other 
is continuing to discuss it. I don't care who is using it (even effectively); 
we're all smart enough to know both that DRM cannot work, and yet there are 
users of it that are happy with it. Whatever.

Slightly tangential to this is a discussion of expiration of signing keys. In 
reality, they don't expire. Unless you you make a device that can be 
permanently broken by setting the clock forward (which is certainly possible, 
merely not desirable), then expiry can be hacked around. The rough edge of what 
happens to code that expires while it is executing generalizes out to a set of 
other problems that just show that in fact, you can't really expire a code 
signing key any more than you can revoke it -- that is to say there are many 
edge conditions in which it works and many of these are useful to some people 
and some circumstances, but in the general case, it doesn't and cannot work.

But that doesn't mean that code signing is a bad thing. On the contrary, code 
signing is very useful because you can use the key, the signature, or the hash 
as a way to detect malware and form a blacklist, as well as detect software 
that should be whitelisted.

Simply stated, an anti-malware scanner can detect (and remove) a specific piece 
of malware by the simple technique of comparing its signature to a blacklist. 
It can compare a single object's hash to a list of hashes and that only 
requires the scanner to hash the code object; this catches the simple case of 
malware that is merely re-signed with a new key. It also permits it to do more 
complex operations than a simple hash (like hashing pieces, or hash at 
different times) to identify a piece of malware. It can also use the key to 
detect whole classes of malware (or good-ware).

Code signing is good because it gives the anti-malware people a set of tools 
that augment what they have with some easy, fast, effective ways to categorize 
software as known goods or known bads. 

But that's it -- you don't get the spooky action at a distance aspects that 
some people think you can do with revocation. You get something close, if you 
feed the blacklist/whitelist information to whatever the code-scanner is. 
Nonetheless, this answers how you deal with signed malware (once it's known to 
be malware, you stop it via signature), or bogus 512-bit signing keys (just 
declare anything signed by such to either be treated as malware or as unsigned).

So am I missing something? I feel like I'm confused about this discussion 
because *of* *course* you can't revoke a key and have that magically transmit 
to software. Perhaps some people believe that daft notion and have built 
systems that assume that this is true. So what? Maybe it works for them. The 
places where it doesn't work aren't even interesting. Perhaps observing when 
this daft notion meets the real world is helpful as an object lesson. Perhaps 
it works for *them* but not *us*.

But really, I think that code signing is a great thing, it's just being done 
wrong because some people seem to think that spooky action at a distance works 
with bits.

Jon

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread Steven Bellovin

On Dec 7, 2011, at 11:31 23AM, Jon Callas wrote:
 
 
 But really, I think that code signing is a great thing, it's just being done 
 wrong because some people seem to think that spooky action at a distance 
 works with bits.


The question at hand is this: what is the meaning of expiration or revocation
of a code-signing certificate?  That I can't sign new code?  That only affects
the good guys.  That I can't install code that was really signed before the
operative date?  How can I tell when it was actually signed?  That I can't
rely on it after the specified date?  That would require continual resigning
of code.  That seems to be the best answer, but the practical difficulties
are immense.


--Steve Bellovin, https://www.cs.columbia.edu/~smb





___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread William Whyte
  But really, I think that code signing is a great thing, it's just
being done
 wrong because some people seem to think that spooky action at a distance
 works with bits.

 The question at hand is this: what is the meaning of expiration or
revocation
 of a code-signing certificate?  That I can't sign new code?  That only
affects
 the good guys.  That I can't install code that was really signed before
the
 operative date?  How can I tell when it was actually signed?  That I
can't
 rely on it after the specified date?  That would require continual
resigning
 of code.  That seems to be the best answer, but the practical
difficulties
 are immense.

I would say that you shouldn't install code signed with an expired
certificate,
but you can continue to use code you've already installed after the
certificate
expires. That requires occasional resigning, rather than continual.

Revocation is a different issue; it seems like it should require either
spooky
action at a distance or TTP timestamping (as ianG discussed earlier). In
the
absence of timestamping, if a code signing cert is revoked, the most
sensible
thing to do is uninstall any software signed with that cert.  This
obviously means
that even expired certs need to appear on a CRL, which raises the question
of
whether it's worth expiring them. I think it probably is worth it because
(a) it raises
the possibility that one cert's worth of software could be withdrawn
without
requiring all software from that vendor to be compromised, although at
that point
you're into marginal gains in very hypothetical situations; and (b) it
makes going out
of business cleaner and doesn't leave too many still-valid certs floating
around.

William
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread Steven Bellovin

On Dec 7, 2011, at 12:34 29PM, Jon Callas wrote:

 
 On 7 Dec, 2011, at 8:52 AM, Steven Bellovin wrote:
 
 
 On Dec 7, 2011, at 11:31 23AM, Jon Callas wrote:
 
 
 But really, I think that code signing is a great thing, it's just being 
 done wrong because some people seem to think that spooky action at a 
 distance works with bits.
 
 
 The question at hand is this: what is the meaning of expiration or revocation
 of a code-signing certificate?  That I can't sign new code?  That only 
 affects
 the good guys.  That I can't install code that was really signed before the
 operative date?  How can I tell when it was actually signed?  That I can't
 rely on it after the specified date?  That would require continual resigning
 of code.  That seems to be the best answer, but the practical difficulties
 are immense.
 
 I want to say that the answer is mu because you can't actually revoke a 
 certificate. That's not satisfying, though.

It's certainly one possible answer, and maybe it's the only answer.  For now, 
though, I'd like to assume that there can be some meaning but I at least don't 
know what it is.
 
 I think it is a policy question. If I were making a software development 
 system that used certificates with both expiration dates and revocation, I 
 would check both revocation and expiry. I might consider it either a warning 
 or an error, or have it be an error that could be overridden. After all, how 
 can you test that the revocation system on the back end works unless you can 
 generate revoked software?

I'm not sure what you mean.
 
 On a consumer-level system, I might refuse to install or run revoked 
 software; that seems completely reasonable. Refusing to install or run 
 expired software is problematic -- the thought of creating a system that 
 refuses to work after a certain date is pretty creepy, and the workaround is 
 to set the clock back. 

Yup.  In fact, it's more than creepy, it's an open invitation to Certain 
Software Vendors to *enforce* the notion that you just rent software.
 
 But really, it's a policy question that needs to be answer by the creators of 
 the system, not the crypto/PKI people. We can easily create mechanism, but 
 it's impossible to create one-size-fits-all policy.
 
Right now, I'm speaking abstractly.  I'm not concerned with current PKIs or 
pkis or business models or what have you.  If you'd prefer, I'll rephrase my 
question like this: Assume that there is some benefit to digitally-signed code. 
 (Note carefully that I'm not interested in how the recipient gets the 
corresponding public key -- we've already had our PKI is evil discussion for 
the year.)  Given that there is a non-trivial probability that the private 
signing key will be compromised, what are the desired semantics once the user 
learns this.  (Again, I'm saying nothing about how the user learns it -- CRLs 
or OSCP or magic elves are all (a) possible and (b) irrelevant.)  If the answer 
is it depends, on what does it depend?  Whose choice is it?  

Let's figure out what we're trying to accomplish; after that, we can try to 
figure out how to do it.


--Steve Bellovin, https://www.cs.columbia.edu/~smb





___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread Peter Gutmann
d...@geer.org writes:

Another wrinkle, at least as a logic problem, would be whether you can revoke
the signing cert for a CRL and what, exactly, would that mean

That's actually a known problem (at least to PKI people).  So what you're
really asking is whether a self-signed root cert can revoke itself, since a
lower-level cert can always be revoked by a higher-level one:

  The handling of CA root certificates is particularly problematic because
  there's no effective way to replace or revoke them.  Consider what would be
  required to revoke a CA root certificate.  These are self-signed, which
  means that the certificate would be revoking itself.  In the presence of
  such a revocation applications can react in one of three ways: they can
  accept the CRL that revokes the certificate as valid and revoke it, they can
  reject the CRL as invalid because it was signed by a revoked certificate, or
  they can crash (and some applications will indeed crash in this situation).
  Since revocation of a self-signed certificate is the PKI version of
  Epimenedes paradox All Cretans are liars and PKI applications are unlikely
  to be coded to deal with self-referential paradoxes, crashing is a perfectly
  valid response.

--dan, quite possibly in a rat hole

No, not really, the PKI folks have it sorted out: Ostrich algorithm, like many
other known paradoxes and problems created by the standards

Peter.

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread Peter Gutmann
Marsh Ray ma...@extendedsubset.com writes:

Originally, public key systems were said to possess deliver this property of
'nonrepudiation', meaning a digital signature could effectively authenticate
the intent of the party associated with the private key.

Uhh, they were never said to deliver this property by anyone who knew anything
about law, they were simply declared to have it by mathematicians and
standards committees:

  The term repudiation has a specific legal meaning but this has nothing to
  do with the use of the term in certificates, and there seems to have been
  little to no input from lawyers into the PKI standards that were meant to be
  used for digital signatures (it's always amusing watching heated arguments
  in standards groups over what both sides think that lawyers might advise if
  they actually asked them).  In particular, disabusing geeks of the notion
  that what's referred to in crypto/PKI theory as nonrepudiation actually
  means anything in a real-world legal context is really, really hard.  Geeks
  really want to believe in the magic of cryptography.

In recognition of this, X.509 some years ago stopped even pretending that
digital signatures provided nonrepudiation.  The certificate flag that used to
be nonRepudiation is now called contentCommitment to indicate it's for a long-
term signature, while digitalSignature is for a short-term signature like
authenticating for an online service.

(There's a lot more to the NR/CC saga than that, very few implementers seem to
have got the memo about NR = CC and everyone just uses digitalSignature for
everything, see the magic of cryptography comment in the excerpt above).

Peter.

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread Peter Gutmann
ianG i...@iang.org writes:

However, if one is relying on an external TTP to time-stamp the digital
signature, one can also rely on the TTP to evidence the hash of the document.
In which case, the digital signature is not performing any signing task
(although it may form a local authentication role, or may not, as the case
may be).  In this case the act of presenting the hash for external time-stamp
is the act of signing, not the act of affixing a digital signature.

You have to remember that using TSAs for code is used purely as a means of
getting around the fact that for CA billing purposes the code-signing cert
expires after one year.  You need to extend this in some PKI-compliant manner
in some way, and timestamping is the only obviously available way to do it.
If you want to extend it in a non-PKI-compliant manner you just ignore cert
expiration for code signatures, as the Java folks did when they removed any
certificate-expiry checks in JCE 1.2.2 after the expiry of the JCE 1.2.1
certificate on 27 July 2005 caused considerable pain for vendors of products
that used it.

Alongside TB2F CA certs, code-signing TSA certs are another example of
irrevocable certs, which would make them tempting targets for theft.  Unlike
CA roots, they have to be kept online at all times since they're used for
automated code-signing.  There are probably many more examples of certs in
similar too-critical-to-revoke certs that can't be protected as well as CA
roots.  How many can you spot?

Peter.

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread Peter Gutmann
Steven Bellovin s...@cs.columbia.edu writes:

Assume that there is some benefit to digitally-signed code.

There is at least one very obvious benefit: When malware is signed, it can't
mutate on each generation any more but has to remain static.  This makes it
easier for the anti-malware folks to detect.

You can also use it a second way:

  When malware authors have signed their products (at least until now) with
  fraudulently-obtained certificates (but not stolen ones) the only thing that
  they've signed with that particular certificate is malware. This means that
  once a particular signed binary has been detected as being malware the virus
  scanner can extract the signing certificate and know that anything else that
  contains that particular certificate will also be malware, with the
  certificate providing a convenient fixed signature string for virus scanners
  to look for.  This actually provides a real, effective use for code-signing
  certificates, although it's certainly one that the PKI folks would never
  have dreamed of.

  Unfortunately as with many other arms-race tricks it only works as long as
  the malware authors don't try to counter it, either by buying a new
  certificate for each piece of malware that they release (it's not as if
  they're going to run out of stolen credit cards and identities in a hurry)
  or by siphoning large numbers of benign applications from software-
  distribution sites, signing them, and re-uploading them to other software
  distribution sites so that the signed files that constitute actual malware
  get lost in the noise.

Let's figure out what we're trying to accomplish; after that, we can try to
figure out how to do it.

See above, code signatures help increase the detecability of malware, although
in more or less the reverse of the way that it was intended.

Peter.

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread Marsh Ray


   [Really this is to the list, not so much Jon specifically]

On 12/07/2011 02:10 PM, Jon Callas wrote:


Let's figure out what we're trying to accomplish; after that, we
can try to figure out how to do it.


I think that's the central problem we're dealing with. There is scads
of mechanism and little policy.

I also don't think we're going to agree on what policy should be,
except within limited contexts.


We've discussed CAs, PKI, liability, policy, etc.

But conspicuously absent in this discussion has been the Relying Party
(i.e., the end user) and their software vendor.

As weird as this sounds, the RP is the party with the ultimate control.
With the notable exception of DRM, it is the end user and the software
they selected to operate on their behalf who takes the bits from various
sources, drops them into this Rube Goldberg contraption, turns the
crank, and receives a slip of paper as output. At that point, it is up
to the user (in coordination with their software vendor) to behave
differently according to their interpretation of the result.

So I would like to differ a little bit with this statement:

On 12/07/2011 01:34 PM, ianG wrote:

Revocation's purpose is one and only one thing:  to backstop the
liability to the CA.


Maybe that's how it's design was originally motivated, but a facility
like revocation *is* precisely what users and their software vendors
make of it.

For example:

* There are operating systems that can and do apply regular updates on
root CAs and CRLs as part of their recommended regular patch channel.

* Microsoft implemented effectively CA pinning for certain Windows code
updates quite some time ago.

* A clueful Gmail user detected the otherwise-valid Iranian MitM cert
because Google implemented effectively CA pinning in Chrome, at least
for its own sites.

* Walled-garden app stores and DRM. Sure we all hate it and it's a
largely different threat model, but it's an example of something.

These examples have one thing in common: it is possible that something
can be widely deployed that's more effectively secure than we have now.

Yes, there will be difficulties. No, it will never be perfect. But boy
is there ever an opportunity for improvement.

It may upset some apple carts however, in particular one of my
favorites. It's called: Wow PKI is really busted, let's make popcorn
and watch the slow motion train wreck play out on the tubes.

But I find this especially ridiculous because I know for a fact that
there are people on this list who working for and directly advising
every part of PKI: the big browsers, other client software vendors,
secure websites, CAs, cypherpunks, academic cryptographers, end users,
you name it!

Moxie gets this, his convergence proposal talk has  33K views on
YouTube and he just sold his company to Twitter. What's up with that, hmm?

Google gets this, they have multiple proposals and implementation
projects going on for enhancements in this area. And they'll
nonchalantly deploy something into Chrome in some future unnumbered
update, Mozilla will follow soon after, and then the spec will be
submitted to IETF for copy editing.

CAs we will have with us always, but the current semantics of PKI
validation (trusted roots and spotty revocation checking) are on their
way out the door. Some products will rise, some will fall, some vendors
will feel some pressure, and yes even some users will get educated about
security in the process.

So, will you be making a contribution to the solution?

- Marsh
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread Steven Bellovin

On Dec 7, 2011, at 4:56 29PM, Peter Gutmann wrote:

 Steven Bellovin s...@cs.columbia.edu writes:
 
 Let's figure out what we're trying to accomplish; after that, we can try to
 figure out how to do it.
 
 See above, code signatures help increase the detecability of malware, although
 in more or less the reverse of the way that it was intended.
 
I meant by canceling the key (I'm trying to avoid using the word revocation),
not by signing code.


--Steve Bellovin, https://www.cs.columbia.edu/~smb





___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread Marshall Clow
On Dec 7, 2011, at 1:56 PM, Peter Gutmann wrote:

 Steven Bellovin s...@cs.columbia.edu writes:
 
 Assume that there is some benefit to digitally-signed code.
 
 There is at least one very obvious benefit: When malware is signed, it can't
 mutate on each generation any more but has to remain static.  This makes it
 easier for the anti-malware folks to detect.

This is only true if signing the malware is an expensive (in some terms) 
proposition.
It's certainly not expensive in terms of computing power.

-- Marshall

Marshall Clow Idio Software   mailto:mclow.li...@gmail.com

A.D. 1517: Martin Luther nails his 95 Theses to the church door and is promptly 
moderated down to (-1, Flamebait).
-- Yu Suzuki

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread Jon Callas

On 7 Dec, 2011, at 11:34 AM, ianG wrote:

 
 Right, but it's getting closer to the truth.  Here is the missing link.
 
 Revocation's purpose is one and only one thing:  to backstop the liability to 
 the CA.

I understand what you're saying, but I don't agree.

CAs have always punted liability. At one point, SSL certs came with a huge 
disclaimer in them in ASCII disclaiming all liability. Any CA that accepts 
liability is daft. I mean -- why would you do that? Every software license in 
the world has a liability statement in it that essentially says they don't even 
guarantee that the software contains either ones or zeroes. Why would 
certificates be any different?

I don't think it really exists, not the way it gets thrown around as a term. 
Liability is a just a bogeyman -- don't go into the woods alone at night, 
because the liability will get you!

Jon

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread lodewijk andré de la porte
I'm afraid signing software is multiple levels of bullocks. Imagine a user
just clicking yes when something states Unsigned software, do you really
want to install?. Imagine someone working at either a software or a
signing company. Imagine someone owning a little bitty software company
that's perfectly legitimate and also uses the key to sign some of his
maleware.

Software signing isn't usable for regular end users, experienced users
already have hashes to establish integrity up to a certain level, guru's
and security professionals compile from source instead of trusting some
binary. And yes that does exclude hidden-source software, it's the only
sensible thing to do if you don't want trust but real security!

-Lewis

2011/12/7 Jon Callas j...@callas.org


 On 7 Dec, 2011, at 11:34 AM, ianG wrote:

 
  Right, but it's getting closer to the truth.  Here is the missing link.
 
  Revocation's purpose is one and only one thing:  to backstop the
 liability to the CA.

 I understand what you're saying, but I don't agree.

 CAs have always punted liability. At one point, SSL certs came with a huge
 disclaimer in them in ASCII disclaiming all liability. Any CA that accepts
 liability is daft. I mean -- why would you do that? Every software license
 in the world has a liability statement in it that essentially says they
 don't even guarantee that the software contains either ones or zeroes. Why
 would certificates be any different?

 I don't think it really exists, not the way it gets thrown around as a
 term. Liability is a just a bogeyman -- don't go into the woods alone at
 night, because the liability will get you!

Jon

 ___
 cryptography mailing list
 cryptography@randombit.net
 http://lists.randombit.net/mailman/listinfo/cryptography

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread Marsh Ray


On 12/07/2011 07:01 PM, lodewijk andré de la porte wrote:

I figured it'd be effective to create a security awareness group
figuring the most prominent (and only effective) way to show people
security is a priority is by placing a simple marking, something like
 this site isn't safe!


I thought the international symbol for that was already agreed upon:
goatse.cx


On 12/07/2011 07:13 PM, lodewijk andré de la porte wrote:

I'm afraid signing software is multiple levels of bullocks. Imagine a
 user just clicking yes when something states Unsigned software, do
you really want to install?.


You're just thinking of a few code signing schemes that you have direct 
experience with.


Apple's iPhone app store code signing is far more effective for example.


Imagine someone working at either a
software or a signing company. Imagine someone owning a little bitty
software company that's perfectly legitimate and also uses the key to
sign some of his maleware.


His own malware? With his own certificate? How dumb can he be?


Software signing isn't usable for regular end users, experienced
users already have hashes to establish integrity up to a certain
level, guru's and security professionals compile from source instead
of trusting some binary. And yes that does exclude hidden-source
software, it's the only sensible thing to do if you don't want trust
but real security!


A scandal broke just the other day when http://download.cnet.com/ was 
found to be trojaning downloaded executables in their custom download 
manger wrapper. Just to be helpful, this wrapper would change your home 
page to Microsoft, change your search engine to Bing, and install a 
browser toolbar that did lord knows what other helpful stuff if you were 
dumb enough to click the Yes please install the helpful thing I 
downloaded button. After the find their PC filled with crapware, users 
likely attribute it to the poor unsuspecting developer of the legitimate 
application they'd intended to download.


Even the simplest code signing mechanism at least prevent application 
installers from being corrupted by commercial distribution channels like 
that. But only IF enough users were given a security justification for 
insisting on a valid signature on the installers that CNET would 
recognize that that kind of sleazy practice it would harm their brand.



http://download.cnet.com/8301-2007_4-57338809-12/a-note-from-sean-regarding-the-download.com-installer/


MS Windows 8 is said to be introducing an app store distribution channel.

- Marsh
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread lodewijk andré de la porte
I'm afraid far more effective just doesn't cut it. Android has install
.APK from third party sources which you'll engage whenever you install an
APK without using the market, trusted or not. You can just put you malware
on the market though, they can then pull it back off 'n all but just
package it in Sexy asian girls #1283 and the like with different accounts
everytime. You're still in a bit of a sandbox though, can't help that
(although some do say it's not worth that much).
The appstore has heavy code review (so they say) that'd prevent malware
from entering the appstore, so far so good, it also prevents some
legitimate and a whole lot of questionable stuff. So people invented Cydia.
I never used it and I sure as hell didn't check it security features, but I
think you see where this is going.

Naturally, as with all security, implementation matters a lot. I'm not
saying I'm against signing stuff, I'm just saying I don't think it ever
helped me.

Op 8 december 2011 02:54 schreef Marsh Ray ma...@extendedsubset.com het
volgende:


 On 12/07/2011 07:01 PM, lodewijk andré de la porte wrote:

 I figured it'd be effective to create a security awareness group
 figuring the most prominent (and only effective) way to show people
 security is a priority is by placing a simple marking, something like
  this site isn't safe!


 I thought the international symbol for that was already agreed upon:
 goatse.cx



 On 12/07/2011 07:13 PM, lodewijk andré de la porte wrote:

 I'm afraid signing software is multiple levels of bullocks. Imagine a
  user just clicking yes when something states Unsigned software, do
 you really want to install?.


 You're just thinking of a few code signing schemes that you have direct
 experience with.

 Apple's iPhone app store code signing is far more effective for example.


  Imagine someone working at either a
 software or a signing company. Imagine someone owning a little bitty
 software company that's perfectly legitimate and also uses the key to
 sign some of his maleware.


 His own malware? With his own certificate? How dumb can he be?


  Software signing isn't usable for regular end users, experienced
 users already have hashes to establish integrity up to a certain
 level, guru's and security professionals compile from source instead
 of trusting some binary. And yes that does exclude hidden-source
 software, it's the only sensible thing to do if you don't want trust
 but real security!


 A scandal broke just the other day when http://download.cnet.com/ was
 found to be trojaning downloaded executables in their custom download
 manger wrapper. Just to be helpful, this wrapper would change your home
 page to Microsoft, change your search engine to Bing, and install a browser
 toolbar that did lord knows what other helpful stuff if you were dumb
 enough to click the Yes please install the helpful thing I downloaded
 button. After the find their PC filled with crapware, users likely
 attribute it to the poor unsuspecting developer of the legitimate
 application they'd intended to download.

 Even the simplest code signing mechanism at least prevent application
 installers from being corrupted by commercial distribution channels like
 that. But only IF enough users were given a security justification for
 insisting on a valid signature on the installers that CNET would recognize
 that that kind of sleazy practice it would harm their brand.

  http://download.cnet.com/8301-**2007_4-57338809-12/a-note-**
 from-sean-regarding-the-**download.com-installer/http://download.cnet.com/8301-2007_4-57338809-12/a-note-from-sean-regarding-the-download.com-installer/


 MS Windows 8 is said to be introducing an app store distribution channel.

 - Marsh

___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread Marsh Ray

On 12/07/2011 08:12 PM, lodewijk andré de la porte wrote:

I'm afraid far more effective just doesn't cut it. Android has
install .APK from third party sources which you'll engage whenever you
install an APK without using the market, trusted or not.


That's why I didn't use Android as an example.

I said Apple's iPhone app store code signing is far more effective.

- Marsh
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread Peter Gutmann
Marsh Ray ma...@extendedsubset.com writes:

Apple's iPhone app store code signing is far more effective for example.

The effectiveness of that isn't the PKI or the signing though, it's that Apple
vets the apps before allowing them in the store.  You don't need certs, all you 
need to do is have Apple sign the apps with a key that's burned into the 
iPhone.  PKI in this case just gets in the way.  Heck, you don't even need 
signatures, just have the iPhone contact Apple and say I just got fed 
something with this hash, is it OK?.

(Due to a confusion over certs, you couldn't until recently even verify the
signatures yourself because Apple published a different cert on their web site
than the one they used for signatures.  But in any case any use of PKI, and
possibly even signatures, in this case is just unnecessary complexity).

Peter.
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread Nico Williams
On Wed, Dec 7, 2011 at 8:12 PM, lodewijk andré de la porte
lodewijka...@gmail.com wrote:
 I'm afraid far more effective just doesn't cut it. Android has install
 .APK from third party sources which you'll engage whenever you install an
 APK without using the market, trusted or not. You can just put you malware
 on the market though, they can then pull it back off 'n all but just package
 it in Sexy asian girls #1283 and the like with different accounts
 everytime. You're still in a bit of a sandbox though, can't help that
 (although some do say it's not worth that much).

You misunderstand.  The Android code signing model isn't intended to
protect you from installing malware: it's intended to help Android a)
provide isolation between apps from different sources, b) protect your
apps from untrusted updates.

To protect you from initially installing or running malware requires
something other than code signing.  The most code signing can do to
protect you from initially installing malware is to limit you to
running software from trusted sources, but only if you're willing to
let someone else (e.g., Apple) decide who is trusted and who isn't.

Nico
--
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography


Re: [cryptography] How are expired code-signing certs revoked?

2011-12-07 Thread Peter Gutmann
Marshall Clow mclow.li...@gmail.com writes:

This is only true if signing the malware is an expensive (in some terms) 
proposition. It's certainly not expensive in terms of computing power.

The rate-limiting factor is how many certs you can steal, and how quickly.  The 
technology side doesn't even come into it.  So this is a valid measure, and 
will continue to be so, because you can't speed up the cert-stealing process.

It's the same with monetary fraud, the rate-limiting step there is how fast 
you can cash out the accounts.  Sure, your botnet has collected 50M accounts 
and associated authorisation information, but how fast can you cash them out?

Velocity limiting via computationally intractable means is one security 
measure that is universally effective and hard to bypass.

Peter.
___
cryptography mailing list
cryptography@randombit.net
http://lists.randombit.net/mailman/listinfo/cryptography