Re: We have always been at war with Oceania

2002-07-06 Thread Bill Stewart

The Indianapolis Star newspaper ran the NYTimes version of this story
with the headline "Drug Peddling Pilots May Get Wings Clipped".
I was assuming it would be about revoking their pilots' licenses
or confiscating their airplanes, but no, it was about
shooting them down and machine-gunning any fleeing passengers,
like they did to the Baptist missionary and her baby last year.
Under Fujimori's dictatorship in Peru, they claim to have shot down
about 25 planes, and some politician made a highly-pleased-with-himself
statement about how it's really discouraged traffickers.

There may be True Believers on the pro-terrorist side,
but cocainistas are in it for money, not ideology -
they're happy to blow up the occasional judge, but they're smart enough
not to try going toe-to-toe with the US military in a shooting war.

At 09:39 AM 07/04/2002 -0700, Major Variola (ret) wrote:
http://story.news.yahoo.com/news?tmpl=story&cid=536&ncid=703&e=1&u=/ap/20020704/ap_on_go_pr_wh/bush_drug_flights_1

President Bush is expected to allow resumption of a program to force down or
shoot down airplanes suspected of carrying drugs in Latin America,
a senior administration official said Thursday.

Does Mr. Bush understand tit-for-tat?
Hasn't he figured out that he can bust all the petty hawalas he wants,
but the Cartels have *cash* to spend, an excellent distribution
network, and a few submarines?
Looking for True Believers speaking Spanish...

Maybe the Saudis will start taking planes out for carrying ethanol...




Re: Jonathan Zittrain on data retention, an awful idea

2002-07-06 Thread Bill Stewart

Sigh.  Back when the US Feds were still trying to push Key Escrow on the
National Information Infrastructure, I started research for an April 1 RFC
for the National Information Infrastructure Data Entry Escrow Protocol (NIIDEEP)
and the proposed NIIDEEP Information Network Protocol Implementation Government
Standard, NIIDEEPINPIGS.  (Didn't get it finished by April 1 :-)

Because after all, there's no sense escrowing our crypto keys if you
don't also escrow the cyphertext - some of Eric Hughes's talks on
Message Escrow and Data Retention Policies were excellent explanations
of why those were the critical issues.

If the US Feds and Eurocrats would like us to provide them with all of our data,
the existing internet would need to double in size to transport all of it
to the appropriate government data storage facilities, or more than double
if separate copies need to be provided to multiple national governments,
or much more if local governments such as city trade commissions need copies.
Since this is clearly unrealistic, even with the demise of the E-Bone,
transmission will require the use of off-line data transmission technology.
Waiting for approval of new government standards would take too long,
and would lose access to valuable data during the resulting delay of several years,
but there are several existing standards for data storage that can be applied.

My long-lost research had the FIPS (US Federal Information Processing Standards)
references for several of them, but the current environment requires
the corresponding European standards as well, and I'll need assistance.
But there's also been progress!  RFC 1149 (Avian Carriers) has been implemented,
though scalable implementation and dual-source requirements may require
genetic reconstruction of the Passenger Pigeon to supplement current carrier species.
Modular methods use standard data storage formats and separate transmission.
Standards widely supported in the US include Hollerith cards,
1600 bpi 9-track tapes with EBCDIC character sets and fixed block sizes and LRECLs,
and ASCII-format punch tape (with country-specific standards for recycled paper content).
8-inch floppy disks have also been widely used, and support both
CP/M and RT11 file system formats.
Are there corresponding European standards for data storage?
Transmission methods for data storage media include International
Postal Union standards for link layer and addressing formats and pricing,
though I'm not directly familiar with standards for shipping containers
where data encapsulation is required.
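
For a sense of the throughput such off-line transport can deliver, a rough
back-of-envelope sketch (the figures below are just illustrative assumptions,
not part of the original draft):

    # Toy estimate of postal "off-line data transmission" throughput.
    # All figures are rough assumptions for illustration only.
    tape_capacity_bytes = 40e6            # ~1600 bpi 9-track reel, typical blocking
    tapes_per_parcel    = 20
    delivery_seconds    = 3 * 24 * 3600   # assume three days in the post

    bits_per_second = tape_capacity_bytes * tapes_per_parcel * 8 / delivery_seconds
    print(f"{bits_per_second / 1e3:.0f} kbit/s")   # roughly 25 kbit/s per parcel cycle

So a steady stream of tape parcels runs at roughly dial-up modem speed;
scaling that up to all internet traffic is left as an exercise for the
standards bodies.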

 Thanks;  Bill Stewart, [EMAIL PROTECTED]


From: Jon Zittrain [EMAIL PROTECTED]
Subject: Re: FC: Data retention scheme marches forward in European 
Parliament

I've written something opposing this at 
http://www.forbes.com/forbes/2002/0708/062.html.

Consider the range of proposals for unobtrusive but sweeping Internet 
monitoring. Most of them are doable as a technical matter, and all of them 
would be unnoticeable to us as we surf. Forbes columnist Peter Huber's 
idea is perhaps the most distilled version. Call it the return of the lock 
box. He asks for massive government data vaults, routinely receiving 
copies of all Internet traffic--e-mails, Web pages, chats, mouse clicks, 
shopping, pirated music--for later retrieval should the government decide 
it needs more information to solve a heinous crime. (See the Nov. 12 
column at forbes.com/huber.)

The idea might sound innocuous because the data collected would remain 
unseen by prying eyes until a later search, commenced only after legal 
process, is thought to require it. Make no mistake, however: The idealized 
digital lock box and many sibling proposals are fundamentally terrible 
ideas. Why?


Jonathan Zittrain, Harvard law professor; codirector, Berkman Center for 
Internet & Society.




RE: Revenge of the WAVEoids: Palladium Clues May Lie In AMD Motherboard Design

2002-07-06 Thread Bill Stewart

At 10:07 PM 06/26/2002 -0700, Lucky Green wrote:
An EMBASSY-like CPU security co-processor would have seriously blown the
part cost design constraint on the TPM by an order of magnitude or two.

Compared to the cost of rewriting Windows to have an infrastructure
that can support real security?  Maybe, but I'm inclined to doubt it,
especially since most of the functions that an off-CPU security
co-processor can successfully perform are low enough performance that
they could be done on a PCI or PCMCIA card, without requiring motherboard space.
I suppose the interesting exception might be playing video,
depending on how you separate functions.

(Obviously the extent of redesign is likely to be much smaller in the
NT-derived Windows versions than the legacy Windows3.1 derivatives that
MS keeps foisting upon consumers.  Perhaps XP Amateur is close enough to
a real operating system for the kernel to be fixable?)

I am not asserting that security solutions that require special-purpose
CPU functionality are not in the queue, they very much are, but not in
the first phase. This level of functionality has been deferred to a
second phase in which security processing functionality can be moved
into the core CPU, since a second CPU-like part is unjustifiable from a
cost perspective.




Cracking Dead People's Passwords

2002-07-06 Thread Bill Stewart

One of the usual arguments for key escrow was always
"what if your employee dies and you can't get his data?"
Secret Sharing techniques are of course a better approach,
and even storing sealed envelopes in company safes
is a much better approach than pre-broken crypto.
There've been a couple of stories in the press recently
where weak passwords also solved the problem.
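
For illustration, a minimal sketch of the splitting idea (an all-shares-required
XOR split; no specific scheme is being endorsed here, and a real deployment
would more likely use a k-of-n threshold scheme such as Shamir's):

    import secrets

    def xor_all(chunks):
        # XOR a list of equal-length byte strings together.
        out = bytes(len(chunks[0]))
        for c in chunks:
            out = bytes(a ^ b for a, b in zip(out, c))
        return out

    def split(secret, n):
        # n-of-n splitting: n-1 random shares, plus one share that XORs
        # with them back to the secret.  Losing any single share loses the
        # secret, which is why threshold schemes are preferable in practice.
        shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
        shares.append(bytes(a ^ b for a, b in zip(secret, xor_all(shares))))
        return shares

    def combine(shares):
        return xor_all(shares)

    key = secrets.token_bytes(16)      # e.g. the late admin's disk key
    parts = split(key, 3)              # one share per company officer
    assert combine(parts) == key       # all three together recover it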

One was a radio piece, I think NPR, about one of the companies
in the World Trade Center who'd lost their computer administrators
in the 9/11 attacks.  The remaining employees got together and
started telling stories about their co-workers - their interests,
their family members, where they'd gone on vacation, their dogs' names, etc.
They got most of the passwords.  (It was a piece about modern management
styles, and how in older hierarchical companies there'd be fewer
people who knew their co-workers well enough to do that.)

The other was about the lost password for the database of the personal
library collection of one of the main linguists studying one of
the two main Norwegian dialects.   It's now been cracked...

RISKS-FORUM Digest 22.13
  http://catless.ncl.ac.uk/Risks/22.13.html

Date: Tue, 11 Jun 2002 11:37:02 -0400
From: Lillie Coney [EMAIL PROTECTED]
Subject: Norwegian history database password lost and retrieved

After the password for accessing a Norwegian history museum's database
catalog for 11,000 books and manuscripts had been lost when the database's
steward died, the museum established a competition to recover it.  Joachim
Eriksson, a Swedish game company programmer, won the race to discover the
password (ladepujd, the reverse of the name of the researcher who had
created the database).  How he arrived at it was not disclosed.  [Source:
"Long-lost password discovered: Norwegian history database cracked with help
from the Web," by Robert Lemos, MSNBC, 11 Jun 2002; PGN-ed]

Lillie Coney, Public Policy Coordinator, U.S. Association for Computing
Machinery Suite 510 2120 L Street, NW Washington, D.C. 20037 1-202-478-6124
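
How Eriksson actually found it wasn't disclosed, but purely as illustration,
here's a toy sketch of the style of guessing that recovers passwords like this:
candidates built from names and personal details plus simple transforms such as
reversal (the extra details and the check function are hypothetical):

    def candidates(details):
        # Toy candidate generator: personal details plus simple transforms.
        for word in details:
            w = word.lower()
            yield w
            yield w[::-1]              # "djupedal" -> "ladepujd"
            yield w.capitalize()
            yield w + "1"              # a depressingly common suffix

    def try_crack(check, details):
        # check() stands in for whatever verifies a password attempt.
        for guess in candidates(details):
            if check(guess):
                return guess
        return None

    details = ["Djupedal", "Bergen", "Nynorsk"]           # researcher's name plus made-up extras
    print(try_crack(lambda p: p == "ladepujd", details))  # -> ladepujd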




Re: Ross's TCPA paper

2002-07-06 Thread Bill Stewart

At 09:43 PM 06/28/2002 +0200, Thomas Tydal wrote:
Well, first I want to say that I don't like the way it is today.
I want things to get better. I can't read e-books on my pocket computer,
for example, which is sad since I actually would be able to enjoy e-books
if I only could load them onto my small computer that follows me everywhere.

You may not be able to read an Adobe(tm) Brand E-Book(tm),
but that just means you'll need to buy electronic books from
publishers that don't use that data format - whether it's
raw ASCII text or Palm-formatted text or PalmOS DRMware that
you can also view on your PC using an emulator in glorious 160x160-pixel format :-)
Of course, if your PC's home country of Nauru has Software Police
implementing some local equivalent of the DMCA, that emulator
that you need for debugging may be illegal.

...
How good is Winamp if it can't play any music recorded in 2004 or later?
Given that Windows Media Player can play all your tunes and it takes a
reboot to switch to Winamp, who wouldn't stick with WMP?




Re: Closed source more secure than open source

2002-07-06 Thread Joseph Ashwood

- Original Message -
From: Anonymous [EMAIL PROTECTED]

 Ross Anderson's paper at
 http://www.ftp.cl.cam.ac.uk/ftp/users/rja14/toulouse.pdf
 has been mostly discussed for what it says about the TCPA.  But the
 first part of the paper is equally interesting.

Ross Anderson's approximate statements:
Closed Source:
 the system's failure rate has just
 dropped by a factor of L, just as we would expect.

Open Source:
bugs remain equally easy to find.

Anonymous's Statements:
For most programs, source code will be of
 no benefit to external testers, because they don't know how to program.

 Therefore the rate at which (external) testers find bugs does not vary
 by a factor of L between the open and closed source methodologies,
 as assumed in the model.  In fact the rates will be approximately equal.

 The result is that once a product has gone into beta testing and then into
 field installations, the rate of finding bugs by authorized testers will
 be low, decreased by a factor of L, regardless of open or closed source.

Actually, I both agree and disagree with each of them, due in part to the
magnitudes involved. It is certainly true that once Beta testing (or some
semblance of it) begins there will be users that cannot make use of source
code, but what Anonymous fails to realize is that there will also be beta
testers who can make use of the source code.

Additionally there are certain tendencies in the open and closed source
communities that Anonymous and Anderson have not addressed in their models.
The most important tendencies are that in closed source, beta testing is
generally handed off to a separate division and the original author does
little if any testing, while in open source the authors have a much stronger
connection with the testing, with the authors' duty extending through the
entire testing cycle. These tendencies lead to two positions very different
from what is generally assumed.

First, closed source testing, beginning in the late Alpha testing stage, is
generally done without any assistance from source code, by _anyone_, and this
significantly hampers the testing. It has led to observed situations where
QA engineers sign off on products that don't even function, let alone have
close to 0 bugs, with the software engineers believing that because the code
was signed off, it must be bug-free. This is a rather substantial problem.
To address it one must correct the nominal number of testers for the ones
that are effectively doing nothing. So while L is the extra difficulty in
finding bugs without source code, it is magnified by something approximating
(testers)/(testers doing useful work). Since (testers) is much larger than
(testers doing useful work), the result K = L*(testers)/(testers doing useful work)
tends towards infinite values as the number of useful testers approaches zero.

In open source we have very much the opposite situation. The authors are
involved in all stages of testing, giving another adjustment value for L,
but the quantities involved are substantially different. It must be observed,
as Anonymous did, that there are testers who have no concept of what source
code is, and certainly no idea how to read it; call these harassers. In
addition, though, there are also testers who read source code, and even the
authors themselves are doing testing; call these coders. So in this case
K = L*(harassers)/(harassers+coders), and it's worth noting that K now tends
towards 0 as the coders come to dominate the testing pool.
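
A toy numeric illustration of the two adjustments (the numbers are made up
purely for illustration):

    L = 10.0   # extra difficulty of bug-hunting without source code

    # Closed source: most nominal testers contribute nothing effective.
    testers, useful = 100, 2
    K_closed = L * testers / useful   # 500.0, growing without bound as useful -> 0

    # Open source: harassers can't read source; coders (including the authors) can.
    harassers, coders = 80, 20
    K_open = L * harassers / (harassers + coders)   # 8.0, falling toward 0 as coders dominate

    print(K_closed, K_open)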

It is also very much the case that different projects have different
quantities of testers. In fact, as the number of beta testers grows, the
MTBD(iscovery) of a bug cannot increase, and will almost certainly
decrease. In this case each project must be treated separately, since
obviously WindowsXP will have more people testing it (thanks to bug
reporting features) than QFighter3
(http://sourceforge.net/projects/qfighter3/, the least active development on
SourceForge). This certainly leads to problems in comparison. It is also
worth noting that the actual difficulty in locating bugs is probably related
to the maximum of (K/testers) and the (testers)-th root of K. That means
WindowsXP is likely to have a higher ratio of bugs uncovered in
a given time period T than QFighter3. However, due to the complexity of the
comparisons, QFighter3 is likely to have fewer bugs than WindowsXP, simply
because WindowsXP is several orders of magnitude more complex.
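
Plugging the earlier toy numbers into that heuristic (again, a sketch for
illustration, not the author's calculation):

    def difficulty(K, testers):
        # The heuristic above: max of K/testers and the testers-th root of K.
        return max(K / testers, K ** (1.0 / testers))

    print(difficulty(500.0, 1_000_000))  # huge tester pool (WindowsXP-like): ~1.000006
    print(difficulty(8.0, 3))            # tiny tester pool (QFighter3-like): ~2.67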

So while the belief that source code makes bug hunting easier for everyone
is certainly not purely correct (Anonymous's observation), it is also not
the case that the tasks are equivalent (Anonymous's claim): the multiplier
in closed source approaches infinity, and in open source it approaches 0.
Additionally, the quantity of testers appears to have more of an impact on
bug-finding than the question of open or closed source. However, as always,
complexity plays an enormous role in the number of bugs available to find,
anybody with a few days programming experience 

TPM cost constraint [was: RE: Revenge of the WAVEoid]

2002-07-06 Thread Lucky Green

Bill wrote:
 At 10:07 PM 06/26/2002 -0700, Lucky Green wrote:
 An EMBASSY-like CPU security co-processor would have seriously blown
 the part cost design constraint on the TPM by an order of magnitude or two.
 
 Compared to the cost of rewriting Windows to have an infrastructure
 that can support real security?  Maybe, but I'm inclined to doubt it,
 especially since most of the functions that an off-CPU security
 co-processor can successfully perform are low enough performance that
 they could be done on a PCI or PCMCIA card, without requiring motherboard
 space.

Upon re-reading the paragraph I wrote, I can see how the text might have
been ambiguous. I was trying to express that there was a cost constraint
on the part. Adding the cost of an EMBASSY or SEE environment to the
purchase of every new PC is more than the market for bare-bones or even
mid-range PC's will bear.

--Lucky