Re: IBM Lost Tape(s)

2007-06-11 Thread Florian Weimer
* John Ioannidis:

 I wonder how much it cost them to find current addresses for
 everybody so we could be notified.

I guess it's pretty easy because your personal information is
available to so many organizations, without any safeguards.
Obviously, they had your social security number (it's only the backup
that was lost), so they could work from that.

And more data is being collected: If you participate in their
monitoring program, Kroll can associate an email address with your
SSN, which is probably something that wasn't possible before.


Re: Why self describing data formats:

2007-06-11 Thread Anne Lynn Wheeler

James A. Donald wrote:
Many protocols use some form of self describing data format, for example 
ASN.1, XML, S expressions, and bencoding.


Why?


gml (precursor to sgml, html, xml, etc) 
http://www.garlic.com/~lynn/subtopic.html#sgml


was invented at the science center in 1969 
http://www.garlic.com/~lynn/subtopic.html#545tech


... some recent (science center) topic drift/references in this post
http://www.garlic.com/~lynn/2007l.html#65 mainframe = superserver

G, M, and L were individuals at the science center ... so the requirement was to
come up with an acronym from the inventors' initials. some of the historical
justification for the original markup language paradigm can be found in the
references below.


originally CMS had the script command for document formatting ... using
dot format commands ... i.e. the science center on the 4th floor of 545 tech sq
doing virtual machines, cp67, cms, the internal network, etc ... and multics
on the 5th floor of 545 tech sq ... both drew on some common heritage from CTSS
(and some of the unix heritage traces back thru multics to CTSS as well).

the original GML was sort of a combination of self-describing data (somewhat
for legal documents)
http://www.sgmlsource.com/history/roots.htm
http://xml.coverpages.org//sgmlhist0.html

and document formatting ... when GML tag processing was added to the CMS script
command. Later you find a big CMS installation at CERN ... and HTML drawing
heritage from the waterloo clone of the CMS script command.

http://infomesh.net/html/history/early

the first webserver in the states was at slac (a CERN sister location) ...
another big vm/cms installation:

http://www.slac.stanford.edu/history/earlyweb/history.shtml

recent historical post/reference
http://www.garlic.com/~lynn/2007d.html#29 old tapes

last time i checked, w3c headquarters was around the corner from the old
science center location at 545 tech sq.

before GML, the science center had an activity involving performance data
from the time-sharing service (originally using the virtual machine cp67 service
and then transitioning to vm370) ... lots of system activity data was captured
every 5-10 minutes and then archived to tape ... starting in the mid-60s ...
by the mid-70s there was a decade of data spanning lots of different
configurations, workloads, etc. The original intention when the system activity
data was being archived was to include enough self-describing information that
the data could be interpreted many years later. lots of past posts about using
cp67/vm370 for time-sharing services (both for internal corporate use and
customers offering commercial, online time-sharing services using the platform)
http://www.garlic.com/~lynn/subtopic.html#timeshare
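
a minimal sketch of that self-describing-archive idea (my own illustration in
Python, not the actual cp67/vm370 record format; field names are hypothetical):
each archived record carries its own field names and units, so a reader decades
later needs no external layout description.

  import json

  # each record carries its own schema (hypothetical fields), so the bytes
  # remain interpretable without any separate layout documentation
  record = {
      "schema": {"timestamp": "unix_seconds", "cpu_busy": "percent",
                 "queued_tasks": "count"},
      "data": {"timestamp": 1234567, "cpu_busy": 73.4, "queued_tasks": 12},
  }

  blob = json.dumps(record).encode()   # what would have gone to tape
  recovered = json.loads(blob)         # decode years later, schema included
  print(recovered["schema"]["cpu_busy"], recovered["data"]["cpu_busy"])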

lots of past posts about long term performance monitoring, workload profiling,
benchmarking and stuff leading up to things like capacity planning
http://www.garlic.com/~lynn/subtopic.html#benchmark

much later, you find things like ASN.1 encoding for handling interoperability
of network transmitted data between platforms that might have different
information representation conventions (like the whole little/big endian stuff).
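
a minimal sketch of the representation issue (my example, standard-library
Python): the same 32-bit integer has different raw byte layouts on little-endian
and big-endian platforms, which is exactly what a wire encoding like ASN.1 has
to pin down.

  import struct

  value = 0x0A0B0C0D
  print(struct.pack("<I", value).hex())   # little-endian layout: 0d0c0b0a
  print(struct.pack(">I", value).hex())   # big-endian/network order: 0a0b0c0d

  # a self-describing encoding also tags type and length; DER, for example,
  # would carry this INTEGER as 02 04 0a 0b 0c 0d (tag, length, value)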

one of the things swirling around digital signature activity in the mid-90s
was an almost religious belief that digital certificate encoding mandated
ASN.1.

other digital signature operations that were less religious about PKI,
x.509 identity digital certificates, etc ... were much less strict
about the encoding technique for digitally signed operations ... including
certificateless digital signature infrastructures
http://www.garlic.com/~lynn/subpubkey.html#certless

One of the battles between XML and ASN.1 proponents during the period
was that XML didn't provide a deterministic encoding.
It was really somewhat of a red herring on the digital certificate ... ASN.1
side ... since they were looking at always keeping things ASN.1 encoded
(not just for transmission) ... and only decoding when some specific
information needed extraction.
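
a minimal sketch of the deterministic-encoding issue (my example,
standard-library Python): two logically equivalent XML serializations of the
same data don't produce the same bytes, so a digital signature over one can
never verify against the other.

  import hashlib

  xml_a = '<payment amount="100" currency="USD"/>'
  xml_b = '<payment currency="USD" amount="100"/>'  # same data, attrs reordered

  print(hashlib.sha256(xml_a.encode()).hexdigest())
  print(hashlib.sha256(xml_b.encode()).hexdigest())  # different digest ... a
  # signature computed over xml_a can never verify against re-encoding xml_b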


On the other side were places like FSTC, which was defining a digitally
signed electronic check convention (with transmission over ACH or ISO8583).
There was already a transmission standard ... which ASN.1 encoding would
severely bloat ... not to mention the horrible payload bloat that resulted
from any certificate-based infrastructure needing to append redundant and
superfluous digital certificates.

FSTC just defined appending a digital signature to the existing payload.
The issue then became a deterministic encoding of the information
for when the digital signature was generated and verified. If you
temporarily encoded the payload as XML, generated the digital signature
... and then appended the digital signature to the standard (ACH or
ISO8583) payload ... the problem was that, at the other end,
XML didn't provide a deterministic encoding methodology so that
the recipient could re-encode the payload and verify the digital
signature. So FSTC eventually defined some additional rules for
XML called FSML (Financial Services Markup Language).
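
a minimal sketch of that sign-then-append flow (my illustration in Python, not
FSTC's actual FSML rules; it assumes the third-party "cryptography" package for
Ed25519 signatures): the payload travels in its existing wire format, and both
ends derive a single canonical byte string from it for signing and verification.

  from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

  def canonical_encoding(fields: dict) -> bytes:
      # a deterministic rule both ends agree on: sorted names, fixed separators
      return "&".join(f"{k}={fields[k]}" for k in sorted(fields)).encode()

  payload = {"account": "12345678", "amount": "100.00", "currency": "USD"}

  signer = Ed25519PrivateKey.generate()
  signature = signer.sign(canonical_encoding(payload))  # appended to the record

  # the recipient re-encodes the received fields with the same rule; verify()
  # raises InvalidSignature if the re-encoding (or the payload) differs
  signer.public_key().verify(signature, canonical_encoding(payload))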

Re: Why self describing data formats:

2007-06-11 Thread Anne Lynn Wheeler


re:
http://www.garlic.com/~lynn/aadsm27.htm#24 Why self describing data formats:

for other archaeological trivia ... later i transferred from the science center
to SJR and got to do some of the work on the original relational/sql
implementation, System/R.

a few years later, the L in GML also transferred to SJR and worked on
relational, including being involved in the development of BLOBs
(Binary Large OBjectS) for relational.


roll forward a few years to the acm (database) sigmod conference in san jose in
the early 90s. In one of the sessions, somebody raised the question of what
all this X.500 and X.509 stuff going on in ISO was ... and somebody from the
audience explained that it was a bunch of networking engineers trying to
re-invent 1960s database technology.


today ... you can periodically find heated online discussion about XML
databases and whether they compromise the purity of information integrity that
you get from the relational paradigm. lots of past posts mentioning various
things about system/r, relational database technology, etc
http://www.garlic.com/~lynn/subtopic.html#systemr



Re: A crazy thought?

2007-06-11 Thread Anne Lynn Wheeler

re:
http://www.garlic.com/~lynn/aadsm27.htm#22 A crazy thought?

for some other topic drift regarding certification authorities ... having been
certification authorities for digital certificates targeted at the (electronic
but) offline market ... they encountered a number of issues in the mid-90s as
the world was transitioning to ubiquitous online operation ... the digital
certificates were somewhat targeted at relying parties dealing with total
strangers (that they had no prior information about) and having no timely
mechanism for directly contacting any authorities for references regarding
the stranger.

so one of the issues for x.509 identity certificates ... small x-over from this
other thread
http://www.garlic.com/~lynn/aadsm27.htm#25 Why self describing data formats

was to try and move out of the no-value market into the identity market ... aka
as the world transitioned to ubiquitous online operation ... the remaining
offline was no-value situations where the relying-party couldn't justify the
cost of maintaining information about the parties that they dealt with (aka
something analogous to browser cookies) and/or couldn't justify the cost of
directly contacting responsible agencies for information about the parties
they were dealing with.

now in this recent thread ... somewhat about some internet historical evolution
http://www.garlic.com/~lynn/2007l.html#67 nouns and adjectives
http://www.garlic.com/~lynn/2007l.html#68 nouns and adjectives
http://www.garlic.com/~lynn/2007l.html#69 nouns and adjectives
http://www.garlic.com/~lynn/2007l.html#70 nouns and adjectives

the last post drifts into the subject of some of the recent churn around
identity activities ... also a lengthy post on the subject here:
http://www.garlic.com/~lynn/aadsm27.htm#23 Identity resurges as a debate topic

the certification authorities were somewhat looking at increasing the
value of x.509 identity digital certificates (since there wasn't a lot
of future selling into the no-value market segment) by starting to
grossly overload the digital certificates with enormous amounts of
personal information.

now typically identity has been an authentication characteristic ...
adding potentially enormous amounts of personal information could be considered
an attempt to move into the authorization area ... where a relying-party might
be able to make an authorization, approval, and/or permission decision purely
based on the additional personal information in the digital certificate.

what was seen by the mid-90s was that many of the institutions were
starting to realize that x.509 identity digital certificates, grossly
overloaded with personal information, represented significant privacy
and liability issues. what you saw then was a retrenchment to purely
authentication, relying-party-only digital certificates
http://www.garlic.com/~lynn/subpubkey.html#rpo

with the digital certificate containing little more than a record
locator (where all the necessary information was actually kept ... even
real-time and aggregated information, which is difficult to achieve in a
stale, static digital certificate paradigm) and a public key ... note, however,
we could trivially show that in such situations the stale, static digital
certificate was redundant and superfluous ... aka just add the public key to
the entity's record ... which already had all the personal, private and
other information necessary for authorization. in the payments
market segment ... this is somewhat separate from the fact that
the appended stale, static, redundant, and superfluous digital
certificates were causing a factor of 100 times payload and processing
bloat
http://www.garlic.com/~lynn/subpubkey.html#bloat
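
a minimal sketch of that certificateless idea (my own illustration, not an
actual implementation; field names are hypothetical, and it assumes the
third-party "cryptography" package): the relying party already keeps a record
per account, so the registered public key is just one more field in that
record ... no certificate is needed to bind key to entity.

  from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

  holder_key = Ed25519PrivateKey.generate()

  # the account record is the authoritative, real-time source of information;
  # the registered public key is just one more field (registered out-of-band)
  accounts = {
      "12345678": {
          "balance": 500000,          # cents
          "daily_limit": 100000,
          "public_key": holder_key.public_key(),
      }
  }

  def verify_transaction(account_no: str, payload: bytes, signature: bytes) -> None:
      # no certificate to parse ... look up the record and verify directly;
      # verify() raises InvalidSignature on failure
      accounts[account_no]["public_key"].verify(signature, payload)

  payload = b"account=12345678&amount=100.00"
  verify_transaction("12345678", payload, holder_key.sign(payload))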

one of the other problems faced by certification authorities attempting
to move identity digital certificates (with loads of personal information)
into the authorization market segment was what liability, if any,
certification authorities were going to accept with regard to authorization
problems encountered by the relying-parties (depending on the digital
certificate personal information in their decision making process).


Re: A crazy thought?

2007-06-11 Thread Anne Lynn Wheeler

Ian G wrote:
What you are suggesting is called Web of Trust (WoT). That's what the 
PGP world does, more or less, and I gather that the SPKI concept 
includes it, too.


However, x.509 does not support it.  There is no easy way to add 
multiple signatures to an x.509 certificate without running into support 
problems (that is, of course you can hack it in, but browsers won't 
understand it, and developers won't support you).


re:
http://www.garlic.com/~lynn/aadsm27.htm#22 A crazy thought?
http://www.garlic.com/~lynn/aadsm27.htm#26 A crazy thought?
http://www.garlic.com/~lynn/aadsm27.htm#27 A crazy thought?

actually ... at a very fundamental level both PKI and PGP have nearly identical
business and implementation processes ... the difference is somewhat that the
PKI operations tend to try and make out that their structure is more formal
... and therefore should be more trusted.


Both implementations require that the relying-parties have some sort of local
trusted public key repository ... initially populated with some out-of-band
process. In the SSL PKI scenario ... there tends to be a trusted public key
repository built into each distributed browser ... initially loaded with
possibly 40-50 public keys that you are told you can trust. In the
web of trust scenario ... there tends to be some set of public keys
that are also trusted and have also been acquired in some out-of-band process.

In both scenarios ... the relying-party is expected to trust new public keys
that carry digital signatures ... where these digital signatures can be
verified with public keys from their local repository of (already) trusted
public keys (public keys that have typically been distributed/initialized
by some out-of-band process).
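
a minimal sketch of that common process (my generalization covering both the
PKI and web-of-trust shapes; it assumes the third-party "cryptography"
package): a new public key is accepted only if it carries a digital signature
verifiable with a key already in the local trusted repository.

  from cryptography.exceptions import InvalidSignature
  from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
  from cryptography.hazmat.primitives.serialization import Encoding, PublicFormat

  # local repository, populated out-of-band (browser vendor ship list,
  # key-signing party, etc)
  introducer = Ed25519PrivateKey.generate()
  trusted_keys = [introducer.public_key()]

  def accept_new_key(new_key_bytes: bytes, signature: bytes) -> bool:
      # trust the new key only if some already-trusted key signed it
      for key in trusted_keys:
          try:
              key.verify(signature, new_key_bytes)
              return True
          except InvalidSignature:
              continue
      return False

  newcomer = Ed25519PrivateKey.generate()
  new_key_bytes = newcomer.public_key().public_bytes(Encoding.Raw, PublicFormat.Raw)
  print(accept_new_key(new_key_bytes, introducer.sign(new_key_bytes)))  # True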

It isn't so much that the fundamental processes are different ... it
is more about how tightly controlled and cast in concrete the surrounding
pieces happen to be (aka formalized and not easily changed/adapted).

For totally other drift ... one of the places we came up with the requirement
for multiple digital signatures was in the process for the x9.59 financial
infrastructure for payment transactions ... i.e. in the mid-90s, the
x9a10 financial standard working group had been given the requirement
to preserve the integrity of the financial infrastructure for all retail
payments
http://www.garlic.com/~lynn/x959.html#x959

x9.59 actually doesn't specify the details of the digital signature process ...
but defines the fields necessary for payment transactions, which require
authentication and integrity protection on an end-to-end basis. one of the
scenarios is the authentication of the account holder with a digital
signature (which also provides payment transaction integrity). one of
the trust issues was that there could be various kinds of exploits
at the originating environment (where the account holder's digital
signature and the transaction were otherwise valid). to increase trust
(as an indication of possible countermeasures against these additional
exploits/vulnerabilities), the originating environment was also allowed
to digitally sign the transaction (as a flavor of device authentication ...
possibly a point-of-sale terminal or other kind of device used to
originate the transaction).

the FSTC electronic check work also allowed for multiple digital signatures
... representing the equivalent of requiring multiple co-signers on
business checks ... i.e. business checks that allow for a single signer
if the amount is below some limit ... but require additional co-signers
for larger amounts.
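
a minimal sketch of the multiple-signature idea (my illustration; neither
x9.59 nor the FSTC work specifies this code, and it assumes the third-party
"cryptography" package): each signer signs the same payload, and a co-signer
policy like the business-check amount limit becomes a simple count check at
verification time.

  from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

  payload = b"account=12345678&amount=250000"  # amount in cents
  holder, device = Ed25519PrivateKey.generate(), Ed25519PrivateKey.generate()

  # each signer signs the same payload; the signatures travel appended to it
  signatures = [(k.public_key(), k.sign(payload)) for k in (holder, device)]

  def signatures_ok(payload: bytes, sigs, amount: int, limit: int = 100000) -> bool:
      required = 2 if amount > limit else 1  # co-signer needed above the limit
      for pub, sig in sigs:
          pub.verify(sig, payload)           # raises InvalidSignature on failure
      return len(sigs) >= required

  print(signatures_ok(payload, signatures, amount=250000))  # True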

note that in both the FSTC electronic check and the X9.59 financial
standard scenarios, there was some assumption that the basic transaction
went via the normal existing electronic payment networks ... with appended
digital signature(s) ... where the transaction might actually only be encoded
during just the digital signature generation and verification
processes. recent posts in the encoding thread:
http://www.garlic.com/~lynn/aadsm27.htm#24 Why self describing data formats:
http://www.garlic.com/~lynn/aadsm27.htm#25 Why self describing data formats:

also any additional appending of traditional digital certificates to such
operations could represent a factor of 100 times payload and processing
bloat.
http://www.garlic.com/~lynn/subpubkey.html#bloat


Re: Free Rootkit with Every New Intel Machine

2007-06-11 Thread James A. Donald

Initially I did not believe it; I thought it must be hype or a hoax.

Nope, it is a rootkit in hardware.

http://www.intel.com/business/vpro/index.htm

: : Isolate security tasks—in a separate
: : environment that is hidden to the user
: :
: : [...]
: :
: : Perform hardware and software inventory on
: : PCs—even if they don't have management
: : applications installed or they are powered
: : down, which increases reporting accuracy for
: : licensing, maintenance contracts, and audits.
: :
: : Deploy software patches to PCs more
: : efficiently—even if they are powered down or
: : their OS is inoperable, without disrupting or
: : slowing down the user's workflow.

(The last paragraph means: without the user knowing, and even if the
user is doing his best to stop you.)

