Re: Sun donates elliptic curve code to OpenSSL?

2002-09-24 Thread Greg Broiles

At 10:57 PM 9/24/2002 +0900, Noriyuki Soda wrote:

>Greg Broiles wrote:
> > in the OpenSSL library, provided that the people who don't want to be
> > sued comply with a list of conditions:
> :
> > (2) don't modify Sun's code as provided by Sun, don't use only parts
> > of the donated code, and don't remove the license text from the code.
>
>I think this (2) is a misunderstanding, or may lead to misunderstanding.
>
>For example, i) of 3) says that the modified part of the code
>(i.e. modified since Sun's contribution) is not subject to the
>covenant. But a licensee who modifies Sun's code can still covenant with
>Sun as to the unmodified part of the code.

How does that avoid liability for infringing Sun's patent rights in
the modified part of the code?

Do you have an alternate explanation for the terms of Sun's license?


--
Greg Broiles -- [EMAIL PROTECTED] -- PGP 0x26E4488c or 0x94245961






Re: Sun donates elliptic curve code to OpenSSL?

2002-09-22 Thread Greg Broiles

At 02:47 PM 9/21/2002 +1200, Peter Gutmann wrote:

>[EMAIL PROTECTED] writes:
>
> >Some of the OpenSSL developers are on this list. In case they are too busy to
> >reply, below are some of the comments from the package:
>
>Could someone with legal know-how translate whatever it is this is saying into
>English?

Sun is promising not to sue people for patent infringement for using Sun's 
code as provided
in the OpenSSL library, provided that the people who don't want to be sued 
comply with
a list of conditions:

(1) they promise not to sue Sun for infringing any of their own patents 
which might
cover the use of the donated code

(2) don't modify Sun's code as provided by Sun, don't use only parts of the 
donated code,
and don't remove the license text from the code.


--
Greg Broiles -- [EMAIL PROTECTED] -- PGP 0x26E4488c or 0x94245961






Re: It's Time to Abandon Insecure Languages

2002-07-22 Thread Greg Broiles

At 12:50 PM 7/22/2002 -0400, [EMAIL PROTECTED] wrote:

>CERT is far from a comprehensive source of security bug reports. Does
>anyone have statistics of bug types for Bugtraq or Mitre's CVE?

The CVE data is available at <http://www.cve.mitre.org/cve/downloads/>;
a mechanical (e.g., string-based) search of the database for all reports
(2224 as of the data set from June 25, 2002) finds 461 which mention the
string "buffer overflow" in their description.

For the 563 reports dated in 2001, 99 mentioned buffer overflows.

For the 88 reports published so far in 2002, 21 mentioned buffer overflows.

But - the CVE web pages specifically warn, "CVE is not designed like a 
vulnerability database, so searches for general terms like 'Unix' or 
'buffer overflow' could give you incomplete or inaccurate results."


--
Greg Broiles -- [EMAIL PROTECTED] -- PGP 0x26E4488c or 0x94245961






Re: IP: SSL Certificate "Monopoly" Bears Financial Fruit

2002-07-18 Thread Greg Broiles

At 09:53 AM 7/11/2002 +0200, Stefan Kelm wrote:
> >
> > See <http://www.securityspace.com/s_survey/sdata/200206/certca.html> for
> > recent data re SSL certificate market share; Geotrust, at
>
>I sincerely doubt the numbers presented in this so-called
>"survey". How did they get to a number of only 91,136
>secure servers "across all domains"? There are a huge number
>of CAs, many of which offer certificates to the public
>(see http://www.pki-page.info/#CA). Even if most CAs will
>not have a significant market share those numbers would be
>different.

For another data point, see this Netcraft survey circa January 2001 -

<http://www.netcraft.com/surveys/analysis/https/2001/Jan/CMatch/certs.html>

.. it shows approx 108,000 secure servers (they don't total it, and I didn't
bother adding up all the CAs with 10 certs in use.)

Security Space's numbers for the same timeframe show that they found 58,117
servers - <http://www.securityspace.com/s_survey/sdata/200012/certca.html>.

I don't know if the difference means that, between Jan 2001 and Jun 2002,
Security Space has discovered the other 40,000 secure servers in use; or
if they always see a fraction of what Netcraft does. (Netcraft's current data
is available for a yearly subscription at 1200 UKP.)

What I find especially telling in the recent Security Space results is the 
breakdown by "validity" -

Valid: 17833
Self-signed: 5275
Unknown signer: 13348
Cert-host mismatch: 32536
Expired: 35071

.. so, less than 20% of the certificates that they find on SSL servers in 
use on the open Internet are functioning correctly as part of a PKI; even 
if we assume that every one of the servers with self-signed or
unknown-signer certs is participating in an undocumented or private 
PKI whose details are
unavailable to surveys like this one, that's still only 40% of the visible 
SSL servers. The remaining 60% are apparently misconfigured or forgotten.
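
For what it's worth, here is the arithmetic behind those percentages, assuming
(as the "less than 20%" and "40%" figures suggest) that they are taken against
the 91,136 secure servers reported for the same June 2002 survey; the
categories evidently overlap, since the breakdown sums to more than that
server total. A minimal sketch in Python:

# Security Space, June 2002: cert-validity breakdown vs. reported server count.
# The 91,136 base is an assumption that happens to match the quoted percentages.
servers = 91136
breakdown = {
    "valid": 17833,
    "self-signed": 5275,
    "unknown signer": 13348,
    "cert-host mismatch": 32536,
    "expired": 35071,
}

working_pki = breakdown["valid"] / servers                 # ~19.6% -- "less than 20%"
charitable = (breakdown["valid"] + breakdown["self-signed"]
              + breakdown["unknown signer"]) / servers     # ~40% -- the generous reading

print(f"working PKI: {working_pki:.1%}")
print(f"charitable reading: {charitable:.1%}")
print(f"misconfigured or forgotten: {1 - charitable:.1%}")  # ~60%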


--
Greg Broiles -- [EMAIL PROTECTED] -- PGP 0x26E4488c or 0x94245961






Re: IP: SSL Certificate "Monopoly" Bears Financial Fruit

2002-07-10 Thread Greg Broiles

At 03:48 PM 7/10/2002 -0700, [EMAIL PROTECTED] wrote:
> --
>On 6 Jul 2002 at 9:33, R. A. Hettinga wrote:
> > Thawte has now announced a round of major price increases.  New
> > cert prices appear to have almost doubled, and renewals have
> > increased more than 50%.
>[...]
>Why is not someone else issuing certificates?

See <http://www.securityspace.com/s_survey/sdata/200206/certca.html> for 
recent data re SSL certificate market share; Geotrust, at 
<http://www.geotrust.com>, has 11% of the market, and appears (from their 
web pages; I haven't bought one) to be ready to issue SSL server certs 
without the torturous document review process which Verisign invented but 
Thawte managed to make simultaneously more intrusive and less relevant.


--
Greg Broiles -- [EMAIL PROTECTED] -- PGP 0x26E4488c or 0x94245961






Re: http://www.steganos.com/en/cng/

2002-05-23 Thread Greg Broiles

At 09:01 PM 5/22/2002 +0200, Axel H Horns wrote:

>http://www.steganos.com/en/cng/
>
>In view of its crypto properties, is "Steganos Crypt & Go" a usable
>alternative to PGP or GnuPG? Or is it snake oil?

I haven't used the software myself; but according to the webpage you 
mention, the software encapsulates messages in executable files which are 
to be run by the recipient.

I believe that model is fatally flawed, for a number of reasons (not 
necessarily ranked in order of
severity) -

1.  Platform independence - what if the executable won't execute on 
your recipient's system? The marketing material at 
<http://www.steganos.com/en/cng/Crypt%20and%20Go%20-%20Flyer_en.pdf> says 
that it only runs on Windows systems. What if you want to correspond with 
someone who uses a Mac, or a Unix workstation, or runs a non-Windows OS on 
their PC hardware? Will the messages be readable in 10 years, even on a 
Windows system?

2.  Private key encryption - it appears to use only private key (i.e., 
symmetric, password-based) encryption ("The recipient requires no special 
software, because Crypt & Go packages decode themselves after the password is 
entered.") This means that you've got to pre-arrange & manage keys to use with 
your correspondents, with all of the attendant hassles. (A minimal sketch of 
what that model involves appears at the end of this message.)

3.  Execution of unsolicited, unknown programs - if the recipient 
doesn't have special software, how do they know that the executable they 
received (a) is really from you, and (b) is what it purports to be? What if 
it's email sent by a virus like Klez? An incoming executable might be from a 
third party who had the two of you in his address book when s/he was infected. 
It's wildly irresponsible and reckless to run executables received 
unsolicited via email, which is exactly what Crypt & Go depends on. (In 
light of Klez and other email-forging viruses, it should be abundantly 
clear that it's not good enough to rationalize "well, I recognize the name 
of the person in the From: header, so I guess this is safe".)

Sure, a reasonable response to (3) is to install a virus scanner, and/or 
special crypto software which will authenticate the message before running 
the executable .. but if you've done that, you've abandoned the "no special 
software" marketing feature, and might as well just mail text documents 
back & forth, since you've got an authentication scheme you trust.

They say they use 128-bit AES - which sounds fine, if it's implemented 
appropriately - but even assuming a bulletproof AES implementation, the 
other aspects of the package make it, in my opinion, a danger to its users, 
who would be better served sending emails in the clear or faxes.
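
To make the key-management point in (2) concrete, here is a minimal sketch of
what password-based 128-bit AES encryption involves. This is not Steganos' 
actual format - it's an illustration using the modern Python "cryptography"
package, and the KDF parameters and layout are my own assumptions - but it
shows the essential property: both sides must already share the passphrase.

import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_with_passphrase(passphrase: bytes, plaintext: bytes) -> bytes:
    # Derive a 128-bit AES key from the shared passphrase, then encrypt.
    salt = os.urandom(16)
    key = PBKDF2HMAC(algorithm=hashes.SHA256(), length=16,
                     salt=salt, iterations=200_000).derive(passphrase)
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return salt + nonce + ciphertext

def decrypt_with_passphrase(passphrase: bytes, blob: bytes) -> bytes:
    # The recipient needs the very same passphrase -- nothing here identifies
    # or authenticates the sender, which is the problem described in (3).
    salt, nonce, ciphertext = blob[:16], blob[16:28], blob[28:]
    key = PBKDF2HMAC(algorithm=hashes.SHA256(), length=16,
                     salt=salt, iterations=200_000).derive(passphrase)
    return AESGCM(key).decrypt(nonce, ciphertext, None)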


--
Greg Broiles -- [EMAIL PROTECTED] -- PGP 0x26E4488c or 0x94245961





Re: Shades of FV's Nathaniel Borenstein: Carnivore's "Magic Lantern"

2001-11-21 Thread Greg Broiles

At 10:40 AM 11/21/2001 -0500, [EMAIL PROTECTED] wrote:

>In the same vein, but a different application, does anyone know what
>the state of the art is for detecting such tampering?  In particular,
>when sitting at a PC doing banking, is there any mechanism by which a
>user can know that the PC is not corrupted with such a key logger?
>The last time I checked, there was nothing other than the various
>anti-virus software.

I have not used them, but you might find these of interest, all for Windows 
systems -

Spycop <http://spycop.com>
Hook Protect or PC Security Guard 
<http://www.geocities.com/SiliconValley/Hills/8839/utils.html>

I note that the latter URL loads a page which Bugnosis 
<http://www.bugnosis.org> identifies as containing possible "web bug" 
single-pixel images and complicated cookies.


--
Greg Broiles -- [EMAIL PROTECTED] -- PGP 0x26E4488c or 0x94245961
5000 dead in NYC? National tragedy.
1000 detained incommunicado without trial, expanded surveillance? National 
disgrace.







Re: First Steganographic Image in the Wild

2001-10-16 Thread Greg Broiles

At 11:43 PM 10/15/2001 +0100, Adam Back wrote:

>If you read the web page it was just a demo created by ABC news --
>that doesn't count as found in the wild.  Not that it would be that
>far out to find the odd image in the wild created as a novelty by
>someone tinkering with stego software, or perhaps even individuals
>playing with stego.
>
>Stego isn't a horseman, and the press drumming up scare stories around
>stego is ludicrous.  We don't need any more stupid cryptography or
>internet related laws.  More stupid laws will not make anyone safer.

I agree, but if Congress isn't careful (and they don't seem to be in a
careful mood these days), they'll end up outlawing watermarking in
digital "content", which would do to the DRM (digital rights management)
industry what they tried to do to security researchers with the DMCA.

Perhaps the RIAA and SDMI folks will now come out in favor of
steganography in order to save their businesses.

Or maybe they'll be forced to rewrite their complicated protection schemes
to enable "stego escrow", so that federal agents can monitor the secrets
hidden inside published content, to make sure there aren't any hidden
messages in Anthrax albums.


--
Greg Broiles
[EMAIL PROTECTED]
"We have found and closed the thing you watch us with." -- New Delhi street kids







Best practices/HOWTO for key storage in small office/home office setting?

2001-10-01 Thread Greg Broiles


Are list members aware of any helpful resources describing best practices 
or HOWTOs for protecting cryptographic keys in a small office/home office 
setting?

I'm aware of the following approaches, given the assumption that good 
physical security is unavailable -

1.  Store keys & etc on hard disk inside a laptop which is kept in a 
safe or similar when not in use
2.  Store keys & etc on -
 a.  hard disk in removable carrier
 b.  3.5" floppy/CD/CD-R[W]/Zip disk
 c.  PCMCIA hard disk
 d.  PCMCIA memory
 e.  Compact Flash hard disk
 f.  Compact Flash memory
 g.  Storage-only smartcard
 .. each of which is stored in a safe when not in use
3.  Generate & use keys on crypto smartcard (like Schlumberger's 
Cryptoflex) which is stored in safe when not in use
4.  Generate & use keys in dedicated crypto processor board
5.  Generate & store or generate & use keys stored across network in 
encrypted form

Obviously, much of the above just rewrites a hard problem (protect this 
room) into an easier but not entirely solved problem (protect the interior 
of this safe); and it ignores security for the keys, while in active use, 
against hostile or sloppy software which may be running on the host. It also 
ignores the use of keystroke recorders or visual/audio surveillance systems 
to gather content which is available outside of the crypto envelope/tunnel. 
I'm trying to come up with a list of things people can do to improve (not 
perfect) their security, with modest expenditures and a little bit of extra 
effort during operations.
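
Whatever medium from the list above ends up holding the key, the stored copy
should itself be encrypted under a passphrase (this is essentially approach 5's
"encrypted form", and it applies just as well to floppies and flash cards). A
minimal sketch, using the modern Python "cryptography" package purely as an
illustration - the filename and passphrase are placeholders:

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import rsa

# Generate a keypair and write the private half out encrypted under a
# passphrase, so the disk/floppy/flash copy is useless without it.
key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
pem = key.private_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PrivateFormat.PKCS8,
    encryption_algorithm=serialization.BestAvailableEncryption(b"a long passphrase"),
)
with open("signing-key.pem", "wb") as f:
    f.write(pem)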

Also, is anyone aware of a currently shipping crypto smartcard 
reader/card/driver bundle which integrates well with any flavor of PGP or 
S/MIME mail software? The only example I'm aware of is Litronic's NetSign 
bundle (Cryptoflex + serial card reader + MSIE/Netscape drivers for $99) 
which apparently doesn't support USB or PGP.


--
Greg Broiles
[EMAIL PROTECTED]
"We have found and closed the thing you watch us with." -- New Delhi street kids







Re: Effective and ineffective technological measures

2001-07-29 Thread Greg Broiles

At 11:20 AM 7/29/2001 +0200, Alan Barrett wrote:

>The DMCA said:
> > 1201(a)(1)(A):
> >No person shall circumvent a technological measure that effectively
> >controls access to a work protected under this title.
>
>What does "effectively" mean here?

1201(b)(2)(B):

a technological measure "effectively protects a right of a
copyright owner under this title" if the measure, in the
ordinary course of its operation, prevents, restricts, or
otherwise limits the exercise of a right of a copyright owner
under this title.


--
Greg Broiles
[EMAIL PROTECTED]
"We have found and closed the thing you watch us with." -- New Delhi street kids







Fwd: Re: Cryptographically Strong Software Distribution HOWTO

2001-07-07 Thread Greg Broiles

More from Rodney - I'm avoiding the "is law relevant?" branch of this 
thread because I think it's wandering off-topic, but can continue in 
private email if any of the participants think it's likely to be productive.

>Date: Sat, 07 Jul 2001 08:33:29 -0700
>To: Greg Broiles <[EMAIL PROTECTED]>
>From: Rodney Thayer <[EMAIL PROTECTED]>
>Subject: Re: Cryptographically Strong Software Distribution HOWTO
>
>(I can't tell where the signal and where the noise is in this thread,
>so I'll just say this to you, feel free to forward.)
>
>PKIX and its legacy ancestor, X.509, have a 'revocation reason'
>field in the CRL mechanism.  However, it's not used -- but then
>again Verisign and others don't really use revocation so that's not
>necessarily a good example.  It's more interesting to note that,
>when people try to ask about revocation reasons, it turns out
>there's little consensus (e.g. in the IETF community) on the need
>for a revocation reason.
>
>I think this is because people haven't really tried to deploy these
>systems in a practical manner, rather than because of any architectural
>flaw.
>
>At 01:15 PM 7/3/01 -0700, you wrote:
>
>
>>Because current systems don't, to my knowledge, allow the creators of 
>>revocations to specify the reason(s) for revocation, I wonder if it would 
>>be better to rely on short-lived keys or certs which are renewed 
>>frequently during a person's membership or association with a group.
>
>
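
For reference, the "revocation reason" Rodney mentions is the reasonCode CRL
entry extension. Here is a minimal sketch of attaching one to a
revoked-certificate entry, using the modern Python "cryptography" package
purely as an illustration (the serial number is hypothetical):

import datetime
from cryptography import x509
from cryptography.x509 import ReasonFlags

# Build one CRL entry that carries an explicit revocation reason.
revoked_entry = (
    x509.RevokedCertificateBuilder()
    .serial_number(12345)  # serial of the certificate being revoked
    .revocation_date(datetime.datetime.utcnow())
    .add_extension(x509.CRLReason(ReasonFlags.key_compromise), critical=False)
    .build()
)

ReasonFlags also includes values like affiliation_changed, superseded, and
cessation_of_operation, which map fairly directly onto the distinction drawn
elsewhere in this thread between leaving a project and losing a laptop.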

--
Greg Broiles
[EMAIL PROTECTED]
"Organized crime is the price we pay for organization." -- Raymond Chandler







Re: Cryptographically Strong Software Distribution HOWTO

2001-07-03 Thread Greg Broiles

At 02:13 PM 7/3/2001 -0400, V. Alex Brennen wrote:

>In the case of such a large project, perhaps you could issue
>a separate role key pair to each developer and generate
>revocation certificates which are held by the core group for
>those keys. When a developer leaves the group, the revocation
>certificate for his key would be circulated.

Because current systems don't, to my knowledge, allow the creators of 
revocations to specify the reason(s) for revocation, I wonder if it would 
be better to rely on short-lived keys or certs which are renewed frequently 
during a person's membership or association with a group.

Specifically, a revocation which does not distinguish between "stopped 
working on the project because very busy at new job" and "left laptop with 
private key on public transit" or "discovered installed rootkit on machine 
storing unencrypted private key" does not help people decide whether they 
can reasonably install old(er) distributions of a software package. If the 
package was signed by a person who is no longer participating as a 
builder/distributor, their key should not be current - but that doesn't 
mean that everything which has been signed with their key should be 
considered untrusted, as it would be in the case of a key compromise.

In particular, consider the example of a person who's the last within a 
group to maintain a port/build/distribution on a hard-to-find platform, who 
then leaves the group - it may be difficult to find someone else to replace 
them, so new builds may not be available - but software which was once 
considered working and "official" shouldn't lose that status because of the 
change in group membership.

Certainly, in the best of all possible worlds, everyone who installs 
software would have access to online CRL and CA resources, and we wouldn't 
need to think about whether or not a particular snapshot of reality is 
misleading in an especially optimistic or pessimistic way - but I believe 
we should not design (only) for that world.

In the absence of semantic information about revocations, I think that 
expiration is a more appropriate model where no compromise is reasonably 
suspected, and that revocation is a more appropriate model where compromise 
is suspected or asserted.
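
To make the expiration model concrete, here is a minimal sketch of issuing a
deliberately short-lived certificate - an illustration using the modern Python
"cryptography" package, with hypothetical names, not a description of any
particular project's build system:

import datetime
from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa
from cryptography.x509.oid import NameOID

ca_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)   # project signing CA
dev_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)  # one developer

def name(cn):
    return x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, cn)])

now = datetime.datetime.utcnow()
cert = (
    x509.CertificateBuilder()
    .subject_name(name("release builder"))
    .issuer_name(name("project signing CA"))
    .public_key(dev_key.public_key())
    .serial_number(x509.random_serial_number())
    .not_valid_before(now)
    .not_valid_after(now + datetime.timedelta(days=30))  # expires soon; renew while active
    .sign(ca_key, hashes.SHA256())
)

Signatures made while such a cert was valid can keep their "was official at the
time" status when the holder simply drifts away; revocation stays reserved for
the case where compromise is suspected or asserted.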


--
Greg Broiles
[EMAIL PROTECTED]
"Organized crime is the price we pay for organization." -- Raymond Chandler







non-repudiation, was Re: crypto flaw in secure mail standards

2001-07-02 Thread Greg Broiles
 non-ludicrous non-repudiation is very expensive, and I don't think 
there'd be a lot of institutional resolve behind the "non-repudiation" rule 
of law if in fact the PKI system were misused - e.g., the IT guy gives 
everyone 3 months of vacation .. are they really going to pay everyone to 
stay home for 3 months? Are they really going to promote the IT guy to CIO 
and give him a corner office and a big raise, just because he got ahold of 
the Board of Directors' keyrings? Of course not.

So when, exactly, is it useful? As far as I can tell, it's a tradeoff of 
accuracy for certainty in decision-making - e.g., we'll have the legal 
system enforce judgments based on evidence which we all agree is (in some 
cases) false, but we think that's a better result than we'd get if we dug 
deeper into the facts, because we like speed and certainty over a slow, 
confusing litigation process.

>   It would also be fair
>to say it is more important today than it was 12 years ago, when PEM was
>first getting popular (for as popular as it got).  For that reason, Don
>can call it a "flaw" if he wants to, but I prefer to think of it as the
>"next bite" of the secure email problem which we could reasonably do
>something about; it's certainly not a hard problem technically or a huge
>oversight that got no attention at the time.

It sounds like the real problem is the marketing hype that's accompanied, 
and continues to accompany, PKI products - e.g., "buy this and you'll win 
all your court cases!" "nobody will be able to escape a contract with you!" 
"if people sign something, they'll have to perform!" "you can be sure of 
who you're talking to" - etc.

The flaw discussed here is an example of a case where a cryptographic 
product fails to live up to the expectations of its users, because they 
expected some sort of cryptographic magic to make sure that their 
communications aren't used in an unexpected fashion - and that flaw looks 
especially dangerous if we assume that there's also some sort of legal 
magic which means that people who have digital signatures win (or lose) in 
court automatically.

We can cure the mismatch between expectation and delivery by changing the 
expectation, or changing what's delivered, or both.

Immediately, we can and should change user and developer expectations - we 
should distinguish between means (like cryptography) and ends (like 
security, or reduced risk), and remember that good techniques reduce risk 
and increase security, but don't create a state of perfection.

We also need to pay attention to the use of terminology when crossing 
domains of expertise - e.g., computer science and computer security people 
make messes when they make assumptions about law (like the 
"non-repudiation" distraction, and, generally, confusion about the 
difference between evidence and legal arguments), and legal people make 
messes when they make assumptions about technology (like the ambiguity 
regarding ephemeral copies and copy ownership for computer software, as 
discussed in 17 USC 117(a)).

Eventually, maybe we will build a system of legal rules and technological 
artifacts and processes that interoperate to create, preserve, and 
represent evidence of facts which may be of importance - but we haven't 
done that yet, at least not in the US. I gather that the UK may have 
adopted some of the wilder "non-repudiation" ideas which have been 
suggested and I think it's a grave mistake - both for the technologists who 
proposed that sort of thing, and for the lawmakers who were foolish enough 
to accept the gambit.


--
Greg Broiles
[EMAIL PROTECTED]







Re: crypto flaw in secure mail standards

2001-06-24 Thread Greg Broiles

At 09:45 AM 6/24/2001 +0800, Enzo Michelangeli wrote:

>A question for legal experts on the list: Does all this pose legal risks
>within the current legal framework? In other word, do current digital
>signature laws assume that also the headers are assumed to be authenticated
>and non-repudiable if the message is digitally signed?

The digital signature laws I've seen don't mention and don't support the 
notion of "non-repudiation", which seems to be an obsession among computer 
security people and a non-issue among legal people. The idea that something 
is "non-repudiable" or unarguable or unavoidable is nonsense. I use it as a 
clue detector - if someone talks about non-repudiation, they don't know 
much about US contract law.

The attack raised - at least as it's been summarized, I haven't gotten 
around to the paper yet - sounds like a good one to remember, but too 
contrived to be especially dangerous in the real world today. How often do 
you, or people you know, send short context-free messages to conclude 
important negotiations? And how often would you rely on a digital signature 
to assure you that everything was kosher if an otherwise promising deal or 
negotiation suddenly turned bad? And if you thought you had grounds for a 
lawsuit, wouldn't you send a message or make a phone call first, to the 
effect of "I was really surprised that you ended our discussion so 
abruptly. I understood our agreement to require you to continue to supply 
me with widgets for the next 3 years. If you're serious about ending our 
relationship early, I'm going to have to talk to my lawyer about that, 
because you've put me at a serious disadvantage, now that the spot price of 
widgets has gone up so much."

Sure, let's work on this and make systems better, so that signatures 
include context which helps prevent misunderstanding or active attack. But 
the sky isn't falling - this attack is a nuisance, because it makes its 
victims spend a few hours on the phone ironing out a misunderstanding - and 
it's not at all likely to lead to serious lawsuits.

I just ran across Jon Callas' earlier message in this thread and think he's 
right on the money. Don't sign tiny no-context messages. Don't get 
distracted by the cartoonish fantasy of non-repudiation.
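
One cheap way to follow that advice is to bind the relevant headers into the
text that actually gets signed, so a signed body can't be quietly reused under
a different To:/Subject:/Date:. A minimal sketch of the idea (my own
illustration, not part of the S/MIME or PGP formats of the day; the addresses
and strings are placeholders):

def bind_context(to: str, subject: str, date: str, body: str) -> str:
    # Prepend the headers we care about to the body and sign the whole thing;
    # the verifier then checks that the signed copy matches the outer headers.
    header_block = f"To: {to}\nSubject: {subject}\nDate: {date}\n\n"
    return header_block + body

signed_text = bind_context("counterparty@example.com", "Re: widget contract",
                           "Sun, 24 Jun 2001 09:45:00 +0800",
                           "Agreed - let's proceed on those terms.")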


--
Greg Broiles
[EMAIL PROTECTED]
"Organized crime is the price we pay for organization." -- Raymond Chandler







Re: Cryptobox (was Re: Edupage, June 20, 2001)

2001-06-21 Thread Greg Broiles

At 02:36 PM 6/21/2001 +0100, R. A. Hettinga wrote:

> > PRIVATE LIFE
> > Researchers at Ottawa University are developing Cryptobox, a
> > program that encrypts e-mail, instant messages, and other Internet
> > communications. The program works by sending transmissions over
> > a peer-to-peer network, scrambling each end of the transmission
> > with an encryption code and hiding it underneath a stream of junk
> > traffic. The system automatically decodes the transmissions once
> > they reach their destinations. The researchers have already
> > tested Cryptobox in a network of 40 real and 200 virtual clients
> > and report that the test succeeded. Independent researchers are
> > skeptical, however. Richard Clayton, a computer scientist at
> > Cambridge University, noted, "It's unclear whether they can make
> > this work and keep it stable in the real world with millions of
> > systems." The program could, if successful on a large scale,
> > solve one of the main security vulnerabilities of the Internet.
> > Currently, e-mails, instant messages, and many other transmissions
> > can be easily intercepted by those with access to key areas of a
> > network.
> > (New Scientist Online, 18 June 2001)

The system has been discussed some on InfoAnarchy - 
<http://www.infoanarchy.org/?op=displaystory;sid=2001/6/11/144219/372>

It looks a lot like the principal designer(s) are unfamiliar with previous 
work on MIXes and Crowds, and haven't addressed the collusion-based attacks 
described in the literature. They also seem to believe they've got 
something called "compromised client detection" which prevents collusion 
through the use of digital certificates (!).

They're unwilling to release current source code or documentation, because 
they're planning to patent some aspects of their work; they've also said 
that the software will be released under the GPL and/or the LGPL.

Their design documents will apparently be available for review and comment 
after the implementation is finished.

--
Greg Broiles
[EMAIL PROTECTED]
"Organized crime is the price we pay for organization." -- Raymond Chandler







Re: Lie in X.BlaBla...

2001-06-05 Thread Greg Broiles

At 10:51 AM 6/4/2001 +0800, Enzo Michelangeli wrote:
>[...]
>OK, so excuse me for being dense, but how exactly can such attack be
>perpetrated? No sane authorization system bases authentication on the mere
>presentation of the certificate: one must have the corresponding private
>key.

Laws are not generally written to require "sane" configurations on the part 
of victims in order to achieve a successful prosecution. In particular, 
laws may be especially useful or helpful where potential victims are unable 
to or unlikely to provide their own effective security. If they could 
provide that security, the law would be unnecessary, or at least 
uninteresting and redundant. It's where people can't provide for their own 
security with "sane" systems or configurations that law is needed.

Lest you think that's just philosophical/academic noodling, don't forget 
that Microsoft is based in Washington, and that Microsoft has historically 
had a difficult time with secure design, secure implementations, and secure 
operations. Further, Microsoft has already built a PKI-based code-signing 
hierarchy (and experienced embarrassment and insecurity in operating it), and 
is going deeper, farther down that road with Win2K and XP and all of the 
rest. It's my understanding that every app in 
an XP system must be signed by a key that's been certified by Microsoft as 
a code-signing key, though I don't think I'm particularly up to date on the 
specifics of that security model. Also, remember that Microsoft's .NET and 
Hailstorm initiatives also depend on the security of an operating CA/PKI 
subsystem, and that they'll likely be hosted (at least in part) in Washington.

If your argument is "nobody will ever build an insecure system, be tricked 
into issuing a bad cert, and then want a big stick to use to go after the 
person who got the cert", you should meditate on the Microsoft situation 
for a while. I don't have specific knowledge of Microsoft involvement in the 
drafting and passage process for this statute, but I'd be wildly surprised 
if they weren't involved at some level, simply because of their dominant 
position vis-a-vis the WA economy and their position of respect on 
technical matters among less technical people. If they didn't have some 
in-house PKI smartypants talk to the drafters of this bill, at least 
informally, I'd say Microsoft isn't doing right by its shareholders.

>So, does this section of the law intend to punish misappropriation of a
>private key? Fine, but then it could just say so, better if stated in
>general terms: "Using stolen or otherwise unlawfully obtained information to
>gain unauthorized access to information or engage in an unauthorized
>transaction". This would also cover other cases (is it perhaps OK to tamper
>maliciously with Kerberos tickets or to steal a login password?), including
>non-electronic forms of identity theft.

I must admit I'm at a loss. A few days ago you were up in arms because this 
statute was too broadly drafted, such that it was going to sweep up many 
unsuspecting non-guilty people - now you're saying that you think it should 
have been written even more broadly, so that it reaches even non-electronic 
identity theft. Are you just generally opposed to the idea of the statute, 
and now fishing for a plausible argument to justify your initial opposition?

I still think the statute is a pretty reasonable attempt at prohibiting PKI 
fraud which is unlikely to pose a great danger to people who behave in a 
normal fashion (e.g., doing things that would be legal in a sane environment).

>Talking of which, theft and fraud are ALREADY offences, regardless
>of the context, and I see no point in creating additional statute.

I agree with you about this - even applied to this statute - but 
legislators think they've been elected to Do Something, and they do.


--
Greg Broiles
[EMAIL PROTECTED]
"Organized crime is the price we pay for organization." -- Raymond Chandler







Re: Lie in X.BlaBla...

2001-06-03 Thread Greg Broiles

At 08:53 PM 6/2/2001 -0700, [EMAIL PROTECTED] wrote:
> --
>Greg Broiles wrote:
> > No server will ever fall afoul of the law, because servers aren't subject
> > to criminal liability. A person or an organization might fall afoul of the
> > law if they use a certificate server in a fraudulent way.
>
>The law defines the ordinary use of certificate servers as 
>fraudulent.  Yet another law making felons of us all.

It does no such thing. The law criminalizes the following -

(1) Knowingly misrepresenting one's identity or authorization to obtain a 
certificate which refers to a private key for creating signatures (Sec. 1(1))

(2) Knowingly forging a digital signature (Sec. 1(2)), which means -
    (a) creating a digital signature without the authorization of the
rightful holder of the private key, or
    (b) creating a digital signature verifiable by a certificate listing as
a subscriber a person who -
        (i) does not exist, or
        (ii) does not hold the private key corresponding to the public key
listed in the certificate
    (RCW 19.34.020(16))

(3) Knowingly presenting a certificate for which you are not the owner of the 
corresponding private key, IN ORDER TO OBTAIN UNAUTHORIZED ACCESS TO 
INFORMATION OR ENGAGE IN AN UNAUTHORIZED TRANSACTION (Sec. 1(3), emphasis 
added because it's apparently common to stop reading halfway through that 
sentence)

Which of the above do you consider "ordinary"?

Which of those "makes felons of us all?"

I've been using PKI-based technology for a little over 8 years now, if I 
remember correctly, and can't remember ever needing to do any of (1)-(3) above.

Let's not turn this into another one of those "Postal service will charge 
$.25 per email! Write your senator!" net legends, ok?

I don't think the new law is necessary - it's basically a retread of 
existing fraud and computer misuse statutes - but I don't think it 
criminalizes anything that wasn't criminal before. I haven't spent a lot of 
time crawling through Washington's criminal code - nor criminal courts, 
where the rubber meets the road - so I don't know if the "felony" status 
for this is new, or meaningful, or exemplary - it sounds like overkill, to 
my ears, but so does much of what comes out of our federal and state 
legislatures so I've stopped thinking that's remarkable.

>I knowingly present certificates that are not my own all the time.

In order to obtain unauthorized access to information or engage in 
unauthorized transactions?

I knowingly use firearms and automobiles all the time, too - but I don't 
worry overmuch about laws which criminalize their misuse, because I'm not 
misusing them.

If your fear is that the "unauthorized" word is susceptible to later 
re-interpretation (as a factual matter, not as a legal matter - e.g., 
retroactively revoked permissions) - I agree that's a difficult issue, but 
this law doesn't modify an existing danger, because Washington has already 
criminalized (as a felony, in some cases) "gaining access" to a computer 
owned by another person "without authorization". (RCW 9A.52.110) I also 
note that inducing another to sign a written instrument under false 
pretenses is already a felony. (RCW 9A.60.030).

>In my observation, the way the law works is that they make a law that 
>criminalizes as many people as they can get away with, a dragnet law to 
>define the largest possible number of people as felons, and then they 
>apply that law only to certain people they do not like, and at first do 
>not apply the
>law to the vast majority of people who routinely break it.

I agree that this happens, and that it's bad, but this statute is too 
narrowly drawn to be much use in furtherance of that project.

>Obviously the intent is only to apply this law to pimply faced hackers, 
>just as the original intent of the drug laws was to apply only to blacks, 
>but eventually it will be applied to people like you and me.

If the " . . in order to obtain unauthorized access" language wasn't in 
section (3), I'd agree with you. But it's there, so I don't think this law 
presents a special danger, beyond the fact that it's referring to a new 
technology that's not necessarily well understood. I'd have preferred that 
the WA legislature wait another 5 or 10 years to see what turns out to be a 
real problem and what doesn't - but apparently they weren't inclined to. 
They've already got a statutory scheme at RCW 19.34 regarding certificate 
authorities and digital signatures; it doesn't seem surprising that they 
thought it was appropriate to use criminal law to address misuse of or 
within that framework.


--
Greg Broiles
[EMAIL PROTECTED]
"Organized crrch.is the price we pay for organization." -- Raymond Chandler







Re: Lie in X.BlaBla...

2001-06-01 Thread Greg Broiles

At 09:58 AM 6/1/2001 +0800, Enzo Michelangeli wrote:
> > At 07:22 AM 5/31/2001 +0800, Enzo Michelangeli wrote:
> >
> > >Besides, it would be idiotic to grant access to information or
>authorization
> > >for a transaction to someone, just because he or she has presented a
>"public
> > >key certificate": authentication protocols require possession of the
>private
> > >key. Those legislators just don't know what they are talking about.
> > >Scary.
> >
> > The statute didn't say "just because" or describe a technical architecture
> > for an access control system - it criminalized the presentation of a
> > certificate without "owning" the corresponding private key.
>
>Uhm... So, which devious use of someone else's certificate were those guys
>trying to address? Also a bona fide certificate server could fall afoul of
>such law.

They were trying to address any fraudulent (not "devious") use of a 
certificate to gain access to information, without regard to the technical 
details.

No server will ever fall afoul of the law, because servers aren't subject 
to criminal liability. A person or an organization might fall afoul of the 
law if they use a certificate server in a fraudulent way. It is impossible 
to violate the WA law accidentally, because a conviction under the law 
requires that the convicted person act with the required mental state (the 
part that says "shall not KNOWINGLY", emphasis added). It is possible for a 
person to be careless with respect to what's been forbidden by the 
legislature, or cavalier with respect to what they believe is achievable by 
prosecutors, but that's not the same thing.

>In my experience, misguided laypeople build their attitude towards
>handling of certificates on the assumption that "a certificate is like a
>digital ID card". This sounds like one of those cases.

Have you considered that you might be making the same misguided assumptions 
about the law?


--
Greg Broiles
[EMAIL PROTECTED]
"Organized crime is the price we pay for organization." -- Raymond Chandler







Re: Lie in X.BlaBla...

2001-05-31 Thread Greg Broiles

At 07:22 AM 5/31/2001 +0800, Enzo Michelangeli wrote:

>Besides, it would be idiotic to grant access to information or authorization
>for a transaction to someone, just because he or she has presented a "public
>key certificate": authentication protocols require possession of the private
>key. Those legislators just don't know what they are talking about.
>Scary.

The statute didn't say "just because" or describe a technical architecture 
for an access control system - it criminalized the presentation of a 
certificate without "owning" the corresponding private key.

Matt's point about cert chains was apropos - and it's worth thinking for a 
minute about what it means to own a key, rather than simply possess a copy 
of it, as this seems to be creating a new kind of intellectual property, if 
there's such a thing as title to a keypair - but I don't think that the 
lack of specification of an authentication protocol in the statute implies 
that the legislature thinks there shouldn't be one, nor that any particular 
one should be used. I think they got this part of the statute just right 
(.. though I'm not sure it's time to start writing new laws for PKI).




